
Parenting in the Age of AI

Three years ago, few of us had heard of AI, but with the public release of OpenAI's free ChatGPT at the end of 2022, this changed. A collection of similar tools rapidly followed, all capable of producing text, images, audio, and video in response to ordinary language prompts. Even the most hardened technophobe now finds AI hard to ignore, and parents are asking what this means for their children. Is it dangerous, and if so, what are the risks?

A Report from the Australian Research Council Centre of Excellence for the Digital Child

Tama Leaver and Suzanne Srdarov from Curtin University have produced a report addressing ‘nine of the most urgent challenges and issues in terms of everyday use of GenAI tools, especially when children might be using these systems.’ They urge educators, policy makers, NGOs, parents and ‘anyone thinking about the uses of GenAI tools’ to consider these issues.

Some Key Challenges

The authors begin by addressing the language used to describe AI, which suggests it is human-like and capable of equivalent thought. This builds hype, fuelling fears that AI will develop consciousness and pose a threat to humanity. It also creates the potential for misunderstanding, with children mistaking a chatbot for something alive, real, and even a friend. The authors recommend a shift away from terms like ‘learning’ and ‘imagining’ towards language that more accurately reflects the idea of imperfect data, and a dialogue with children about AI's role as a data set that can be fun, informative, and fallible.

Because data is gathered from mostly undisclosed sources, including social media pages, there is greater potential for misrepresentation of, or bias about, people and places. Outputs are based on the aggregation of data that is mathematically weighted to produce the most statistically likely “answer”, with data from diverse or minority groups less likely to be available. The authors encourage users to ask, “What kinds of stories do these tools tend to produce, and who is rendered more or less visible in them?” Of particular concern is the misrepresentation or appropriation of Indigenous data.

Additional issues include copyright, the environmental costs of mining rare earth minerals to build data centres, and the fact that a search driven by generative AI uses four to five times the energy of a conventional web search. While adults and older children are aware that the AI characters they interact with are not real people, young children may not be. We need to teach them not to share private and personal information that could leave them vulnerable to exploitation. We also need to monitor the level of connection and attachment they may form with these characters, to the exclusion of human relationships.

The authors note the speed with which AI tools are being introduced without attention to their limitations and current flaws. They conclude: ‘The challenges for parents, carers, educators and others who want to introduce younger children to technologies and devices won’t be how to turn GenAI tools on; the challenge will be to turn them off long enough for young people to develop the literacies needed to use them safely and appropriately.’

Leaver, T., & Srdarov, S. (2025). Children and Generative AI (GenAI) in Australia: The Big Challenges. Australian Research Council Centre of Excellence for the Digital Child, Queensland University of Technology.

©Copyright Bower Place Pty. Ltd. 2025
