AI chatbots might dob: Meta issues warning to teens

Teens using AI chatbots built into Instagram, Facebook and Messenger may be asked to explain themselves after Meta announced changes to parental controls.
The US social media giant said it will prepare reports on the topics young users ask Meta AI about and share them with parents and guardians, and will also issue alerts when young people ask the technology about self-harm.
The measure is one of several Meta has announced to control AI on its platforms, and will be introduced in Australia, the US, the UK, Canada and Brazil in the coming months.
But the rules come a month after the eSafety Commissioner named Facebook and Instagram among five digital platforms that failed to take reasonable steps to prevent children under 16 from holding accounts.
Platforms that fail to comply face fines of up to $49.5 million.
Meta announced the changes in a blog post, explaining that parents and guardians who supervise their children’s accounts will receive weekly summaries of the types of questions their teens ask the AI.
“Parents will be able to see the topics their teens asked Meta AI about on that app last week,” the company said in a statement.
“Topics can range from school, entertainment and lifestyle to travel, writing, health and wellbeing.”
Teens’ exact queries will remain private, but parents will be able to select more granular categories, seeing, for example, whether queries in the health category concern fitness or mental health.
The weekly topic reports will come on top of proactive alerts sent to parents when a teen repeatedly asks Meta AI about topics related to self-harm or suicide over a short period.
The company also said it will create an AI Wellbeing Expert Council with participants from three existing advisory groups to provide advice on young people’s use of technology.
The changes come at a time when the company’s treatment of young users is under intense scrutiny both in Australia and abroad.
A US judge ordered Meta to pay US$375 million ($524 million) in March after a jury found Meta misled users about the security of its platforms and allowed the sexual exploitation of children.
The company has vowed to appeal the decision.
Lifeline 13 11 14
Kids Helpline 1800 55 1800 (for people aged 5 to 25)