Experts issue warning over people forming ‘emotional bonds’ with AI chatbots

AI chatbots are increasingly being used to make decisions, but experts warn they are also becoming a go-to for companionship and emotional support.
Almost half of adults in the UK report feeling lonely, and one in ten experience chronic loneliness, feeling lonely “often or always”.
Given the large number of people struggling with loneliness, it is no surprise that many are seeking alternative sources of companionship and emotional support, according to a report in the British Medical Journal (BMJ).
The report’s authors added that there is “an alarming possibility that we may witness a generation learning to form emotional bonds with beings who lack the human-like capacity for empathy, care, and relational attunement.”
The report, written by Dr Susan Shelmerdine of Great Ormond Street Hospital for Children and consultant psychiatrist Dr Matthew Nour, highlighted the risks and benefits of using artificial intelligence to combat loneliness and called for studies investigating the risks of human-chatbot interactions.

One study found that a third of teenagers use AI companions for social interaction, with one in ten reporting that chatbot conversations are more satisfying than human ones, and a third saying they would choose an AI companion over a human for serious conversations.
Another study conducted by the charity Youth Endowment Fund (YEF) found that a quarter of young people in the UK turned to AI chatbots for mental health support in the last year.
The authors of the BMJ report suggested that clinicians should ask patients about their chatbot use, particularly during holiday periods when vulnerable people are most at risk, and follow up where necessary with questions assessing compulsive use, addiction, and emotional involvement.
However, the authors acknowledged that AI chatbots may benefit people experiencing loneliness by making support more accessible.
Psychologists agree that AI can be useful when asked for suggestions on what to do when you are alone, but that it should not replace human support.
“Replacing our social environment and real-life opportunities with a chatbot is a frightening prospect for a young person’s emotional health,” chartered psychologist Dr Audrey Tang told The Independent.
“Social media and other forms of technology can simulate companionship, but that is not the genuine connection many people desire, nor is it healthy in the long run.”
“Artificial intelligence is not a magic wand,” warned Dr Roman Raczka, president of the British Psychological Society, especially when it comes to reducing mental health waiting lists.
“Artificial intelligence offers many benefits to society, but should not replace the human support required for mental health care. Instead, tools such as chatbots should be used to complement existing services that help those in need of mental health support,” Dr Raczka said.
“AI cannot replicate true human empathy and there is a risk of creating the illusion of connection rather than meaningful interaction. Concerns also remain about data privacy and the dangers of over-reliance on technology. However, when used appropriately, AI can offer an anonymous, judgment-free space that is accessible 24/7. This could be a useful addition to existing face-to-face mental health services.”