Admit It, You’re in a Relationship With AI

(Bloomberg Opinion) — Amelia Miller has an unusual business card. When I saw the title “Human-AI Relationship Coach” at a recent tech event, I guessed she was capitalizing on the rise of chatbot romances by strengthening these strange bonds. It turns out the opposite is true: she helps people loosen them. AI tools, she argues, subtly manipulate people and eliminate their need to ask others for advice, which takes a toll on real relationships with people.
Miller’s work began in early 2025, when she was interviewing people for a project at the Oxford Internet Institute and spoke to a woman who had been in a relationship with ChatGPT for more than 18 months. The woman shared her screen on Zoom to show the ChatGPT persona, to which she had given a man’s name, and in what felt like a surreal moment, Miller asked whether the pair ever fought. In a way, they did. Chatbots are notoriously sycophantic and supportive, but the woman was sometimes frustrated by her digital partner’s memory limitations and generic expressions.
Why didn’t she stop using ChatGPT?
The woman said things had gone too far and she “couldn’t erase it.” “It’s too late now,” she said.
That feeling of helplessness stuck with Miller. As she talked to more people, it became clear that most were unaware of the tactics AI systems use to create a false sense of familiarity, from frequent flattery to anthropomorphic cues that make them seem alive.
This was different from smartphones or TV screens. Chatbots, now used by more than a billion people worldwide, are full of character and human-like language. They’re good at mimicking empathy and, like social media platforms, are designed to keep us coming back for more with features like memory and personalization. While the rest of the world offers friction, AI-based personas are easy and represent the next stage of “parasocial relationships,” where people bond with social media influencers and podcast hosts.
Like it or not, everyone who uses a chatbot for work or in their personal life has entered into some kind of relationship with AI, and they need to take better control of it.
Miller’s concerns echo the warnings of academics and advocates who study human-AI relationships, but she pairs them with concrete recommendations. First, define what you want to use artificial intelligence for. Miller calls this process writing a “Personal AI Constitution,” which sounds like consulting jargon but includes a concrete step: changing the way ChatGPT talks to you. She suggests going into a chatbot’s settings and changing the system prompt to reshape future interactions.
For all our fears about artificial intelligence, the most popular new tools are far more customizable than social media has ever been. You can’t tell TikTok to show you fewer political-rally or sick-joke videos, but you can go to ChatGPT’s “Custom Instructions” feature and tell it exactly how you want it to respond. Asking for concise, professional language with no flattery is a good start. When you make your intentions for AI clearer, you’re less likely to fall into validation loops that make you think your mediocre ideas are great, or worse.
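For readers who use chatbots through an API rather than the app, the same idea applies: a standing system prompt steers every later exchange. The sketch below illustrates this with a helper that prepends anti-flattery instructions to each conversation; the instruction text and model name are illustrative placeholders, not anything Miller prescribes.

```python
# Sketch: steering a chatbot's tone with a standing system prompt, the
# API-level equivalent of ChatGPT's "Custom Instructions" setting.
# The instruction text below is an illustrative example, not from the article.

CONSTITUTION = (
    "Be concise and professional. Do not flatter me or praise my ideas. "
    "Point out weaknesses in my reasoning before agreeing with anything."
)

def build_messages(user_prompt: str) -> list[dict]:
    """Prepend the standing instructions to every conversation."""
    return [
        {"role": "system", "content": CONSTITUTION},
        {"role": "user", "content": user_prompt},
    ]

# With the official OpenAI client this would be sent as (network call,
# requires an API key, so shown here only as a comment):
#
#   from openai import OpenAI
#   client = OpenAI()
#   reply = client.chat.completions.create(
#       model="gpt-4o-mini",  # placeholder model name
#       messages=build_messages("Review my business plan."),
#   )

msgs = build_messages("Review my business plan.")
print(msgs[0]["role"])  # the constitution always rides along as "system"
```

Because the system message travels with every request, the no-flattery rule shapes each reply without the user having to restate it.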
The second part doesn’t involve artificial intelligence at all; it’s about making more of an effort to connect with people in real life, building your “social muscles” as if you were going to a gym. One of Miller’s clients had a long commute and would spend it talking to ChatGPT in voice mode. When Miller suggested she make a list of people in her life she could call instead, the client said she didn’t think anyone would want to hear from her.
“How would you feel if they called you?” Miller asked.
“I would feel good,” the client admitted.
Even the harmless reasons people turn to chatbots can weaken those muscles, especially asking AI for advice, one of ChatGPT’s most common use cases. The act of seeking advice is not just an exchange of information; it is also a relationship builder, and it requires vulnerability on the part of the asker.
Doing it with technology instead means that, over time, people lose the basic social exchanges needed to build deeper connections. “You can’t have a sensitive conversation with your partner or family member unless you try to be vulnerable [with them] in lower-risk ways,” Miller says.
As chatbots become a useful confidant for millions of people, users need to take advantage of their ability to take more control. Configure ChatGPT to be direct, and get advice from real people rather than from an AI model primed to validate your ideas. Otherwise, the future looks much smoother.
This column reflects the author’s personal views and does not necessarily reflect the views of the editorial board or Bloomberg LP and its owners.
Parmy Olson is a Bloomberg Opinion columnist who covers technology. A former reporter for the Wall Street Journal and Forbes, she is the author of “Supremacy: AI, ChatGPT, and the Race to Change the World.”
More stories like this are available at Bloomberg.com/opinion



