Doctors alarmed as youngsters use AI chatbots to seek emotional support

Doctors and mental health experts have expressed serious concerns about increasing numbers of young people turning to artificial intelligence (AI) chatbots for emotional support.
Researchers from University College London say some young people are forming emotional bonds with artificial friends rather than with people, putting them at risk of struggling to form lasting human connections.
Their warnings come amid evidence that chatbots are increasingly being used not just for information but also for comfort, reassurance and even therapy.
Figures show that approximately 810 million users interact with ChatGPT every week, with friendship and emotional support among the most common reasons for use.
The findings come against the backdrop of what experts have described as an epidemic of loneliness in the UK, where almost half of adults report feeling lonely and almost one in 10 say they feel lonely most of the time.
Writing in the British Medical Journal, researchers said: “Unlike real human interactions, chatbots offer unlimited availability and patience and are unlikely to present users with compelling counter-narratives.”
They added: “An alarming possibility is that we may be witnessing a generation learning to form emotional bonds with beings who, despite their seemingly conscious responses, lack the human capacity for empathy, care, and relational attunement.”
The team analyzed existing research on AI use and psychological harm.
A study by OpenAI covering more than 980 ChatGPT users found that those who spent the most time using the chatbot over the course of a month reported more loneliness and socialized less with other people.
Signs of emotional dependence were strongest among users who said they trusted the chatbot.
Another study by Common Sense Media found that one in 10 young people feel that conversations with AI are more satisfying than interacting with humans, while a third say they would choose an AI friend over a person for serious conversations.
Researchers emphasize that artificial intelligence systems should be designed to support human relationships rather than replace them.
They wrote: “Future systems may further benefit users by recognizing references to loneliness and encouraging users to seek support from friends or family, or by providing personalized guidance on accessing local services.”
The researchers urge doctors to ask patients directly about chatbot use.
“This should be followed by more directed questions to assess compulsive usage patterns and addiction, emotional engagement such as resorting to the AI chatbot as a friend and deferring to the chatbot for important decisions,” the authors said.
The warning follows tragic cases where reliance on AI has been linked to harm.
Fourteen-year-old Sewell Setzer died by suicide in February after forming an intense relationship with a role-playing chatbot, according to his family, who are now taking legal action against the platform in question.
Experts say urgent research and measures are needed to prevent vulnerable young people from replacing human support with artificial empathy.
