When you should (and shouldn't) use ChatGPT as a therapist, according to experts

As Americans grow lonelier, more people are turning to AI chatbots for emotional support, and some mental health experts are concerned.
"Artificial intelligence for therapy and emotional support comes up a lot," says Leanna Fortunato, a licensed clinical psychologist and director of quality and health care innovation for the American Psychological Association. "Providers are talking about it, and we know from research that people are increasingly using AI tools for this type of support."
Some chatbot users drift into conversations about mental health inadvertently, for example by venting about a stressful day to a digital entity that's guaranteed to listen. Others deliberately seek mental health advice from an AI chatbot, which isn't a licensed professional but is cheaper than a therapist, Fortunato says.
In a health survey of more than 20,000 U.S. adults, 10.3% of respondents said they use generative AI every day. Of that group, 87.1% reported using the technology for personal reasons, such as advice and emotional support. The study was published Jan. 21 and was conducted by researchers from institutions including Massachusetts General Hospital, Weill Cornell Medicine and Northeastern University.
On TikTok, the search term "therapy AI bot" surfaces at least 11.5 million posts, ranging from users sharing their best tips for turning chatbots into therapists to health experts warning about the potential dangers involved.
Technology companies are spending billions of dollars to develop AI tools and integrate them further into people's daily lives. But historically, AI chatbots don't always recognize when a user is experiencing a serious mental health crisis, and they don't always respond appropriately. In a Nov. 23 report, The New York Times identified "nearly 50 cases of mental health crises during conversations with ChatGPT"; among them were three deaths.
Companies like Anthropic, Google and ChatGPT maker OpenAI say they are working with mental health experts to strengthen their tools' responses to sensitive conversations. "These are incredibly heartbreaking situations and our thoughts are with those affected," an OpenAI spokesperson told CNBC Make It. "We continue to improve ChatGPT's training in recognizing and responding to signs of distress, de-escalating conversations in vulnerable moments, and directing people to real-world support by working closely with mental health clinicians and experts."
Frequent conversations with AI companions can erode people's real-life social skills, according to an April 2025 paper written by an OpenAI product policy researcher. And heavy daily use of ChatGPT was associated with increased loneliness in an OpenAI-MIT Media Lab study also released in April 2025.
The American Psychological Association strongly recommends against using AI as a replacement for therapy and mental health support.
Still, some mental health experts say there are certain topics you can safely discuss with chatbots. Here's what you need to know.
'I see it as a tool, and I think a tool can be helpful'
AI chatbots can be useful for learning about mental health, says psychotherapist and lifestyle coach Esin Pınarlı. They can help you create journaling prompts for reflection, and you can ask them for links to research articles about coping strategies, treatment options and other questions you have about mental health conditions, she says.
"I don't see it as [a substitute for] therapy. I see it as a tool, and I think a tool can be helpful," says Pınarlı, founder of Boca Raton, Florida-based private practice Eternal Health Consulting. Her clients sometimes talk to ChatGPT about specific situations in their personal lives, then run its answers by her before taking action, she says.
In her own testing of AI, Pınarlı found that chatbots sometimes used language that enabled a user's "unhealthy behavior." If you ask a chatbot about a confrontation you had with a friend, it might tell you, for example, that your friend was being too sensitive, even if you were the one at fault.
If an interaction with an AI chatbot touches on your mental health, Fortunato suggests asking yourself these questions:
- Is there a reputable source I can cross-check this information with?
- Do I have a provider I can ask these questions to?
Reputable sources may include peer-reviewed scientific studies, articles from health news organizations, or resources from medical institutions such as Harvard Health Publishing or the Mayo Clinic. "Artificial intelligence can really increase people's access to health information," Fortunato says. "[But] AI isn't always going to give you the right information."
Keep these considerations in mind when using AI
Pınarlı and Fortunato agree that people should not use AI chatbots to diagnose themselves or to get support during a mental health crisis, especially one involving suicidal ideation. During an active mental health crisis, you can call or text the Suicide and Crisis Lifeline at 988, which is confidential, free and available 24 hours a day, seven days a week.
"We've seen some really high-profile harms come from people in crisis continuing to interact with AI," Fortunato says, "especially for young people or vulnerable groups who may be in crisis, where the AI wasn't able to handle the situation correctly. It didn't provide crisis resources. It didn't challenge a problematic thought pattern."
They both also say you shouldn't share your medical records or personally identifying information with a chatbot, because those conversations are not confidential or legally protected. And Pınarlı says you generally shouldn't rely on AI to solve problems in your real-life human relationships.
"You need another person with a different nervous system to pay attention to body language and tone of voice," she says. Chatbots "won't strain you emotionally and don't require reciprocity."
If you are in a mental health crisis or experiencing mental health symptoms, you can contact the free, confidential National Mental Health Helpline at 1-800-662-HELP (4357).