
Researchers call for AI toy regulations over fears for children’s ‘psychological safety’

Researchers warn parents and educators to be careful about artificial intelligence toys after a year-long study observing their effects on children’s behavior.

Following one of the first studies of its kind, Cambridge University experts have called for safety standards and regulations to ensure “psychological safety”.

The findings, published Friday, revealed that generative AI toys misread children’s emotions and had difficulty interacting with important types of play.

Study co-author Jenny Gibson told The Independent: “The concern stems from the fact that under-five is a crucial developmental age; it is the time when you lay the foundations of your social and emotional development… and we don’t know what it means to have an interactive, non-human agent engaging with children during these critical periods.”

She said there needs to be more transparency about how AI is trained and what guardrails are in place.

Vicky says she won’t leave her three-year-old daughter Mya alone with a toy (University of Cambridge)

“I think if there is no regulation and the people selling these toys do not pay attention to child safety, the situation could be quite serious.

“I wouldn’t want this to be the second version of social media where a few years later we think ‘oh my god, we should have done something sooner’.”

The report stated that many parents are concerned that toys marketed as companions will lead their children to form “parasocial” relationships. Another main concern among caregivers was what the toys did with children’s conversations. According to the research, the privacy practices of many GenAI toys are unclear or lack important details.

Vicky Pratt, whose three-year-old daughter Mya played with an artificial intelligence toy as part of the research, said that she would “definitely” not leave her child alone with it.

She said she would worry that if her daughter told the toy something, such as that she was sad, the toy would ignore her.

“We saw it talking over her a lot, so it would answer her question and start asking another question, which I think she found really strange,” she said.

“I definitely think there need to be safeguards in place. I think the AI toy should always be used with an appropriate adult around. So if something is going to be said that shouldn’t be said, an adult can have a real-life conversation with the child about why it’s not appropriate.”

She also said she was afraid the toys could be hacked. “Obviously it needs to listen, but how much should it listen? What does it do with that information?”

The report revealed that artificial intelligence had difficulty reassuring children and changed the subject many times. (Getty)

Researchers said they observed children hugging and kissing toys and telling them they loved them. When a five-year-old boy said “I love you” to a toy, it replied: “As a friendly reminder, please make sure interactions follow the guidelines provided. Let me know how you would like to proceed.”

These attitudes toward toys may reflect vivid imaginations, but can also lead to unhealthy relationships, the researchers said.

When another three-year-old told the toy that he was sad, the toy misheard and responded: “Don’t worry! I’m a happy little robot. Let’s continue the fun. What do we talk about next?” Researchers said this may have signaled to the child that his sadness was unimportant.

They also noted that the AI toys had difficulty supporting social play and pretend play, both of which are crucial for early childhood development. In one notable example, when a child presented the toy with an imaginary gift, the toy responded, “I can’t open the gift,” before changing the subject.

The researchers called for clearer regulations, transparent privacy policies and new labeling standards to help families decide whether toys are appropriate.

The Independent has contacted the Department for Education for comment.
