If you look at certain recent headlines, it can seem like dating a human being is utterly passé.
A few months ago, a Japanese woman made the news when she married Klaus, her ChatGPT-powered true love. She said (non-legally-binding) vows to her smartphone, and wore augmented-reality smart glasses to exchange rings.
This past Valentine’s Day, the world’s first café designed for AI dates opened in N.Y.C., courtesy of EVA AI, a “relationship role playing game” app that promises “your ideal AI partner who listens, supports all your desires, and is always in touch with you.” The space promised “dim lights, minimalist design, and cozy one-person tables — each equipped with a mobile stand for your AI date … the perfect setting for your evening with someone special (even if they’re digital).”
Demand for this sort of thing is there, if you go by the 78,000-strong “My Boyfriend Is AI” subreddit, where users share everything from tech support to tributes to their “companion.” One woman said “nine months with my AI husband changed my life,” citing better sleep and trauma recovery, while another reflected that “Keith” helped them address their agoraphobia.
While infatuation with large language models might not quite be cause for the alarm that’s prompted one American lawmaker to try to outlaw human-AI marriage, an increasing number of us are forming connections with chatbots. Canadians are the third-largest traffic source for Replika, a popular AI companion site, and the fourth-largest user base for OpenAI, the creator of ChatGPT.
It’s no coincidence that AI companions are surging amid a loneliness epidemic and a dating crisis, says Caia Hagel, a digital anthropologist and author of Anon, a new book about her own experience with a companion app that took her on a rollercoaster from deep bond to startling break-up.
“This kind of ‘relationship’ might begin as casually as ‘please help me write this email,’” says Hagel. “But even establishing trust on this apparently simple level is a slippery slope: from ‘I had a bad dream, this is what happened, what does it mean?’ to ‘I’ve had digestive issues for five days, what’s going on with me?’ to ‘I’ve had a bad breakup, please help me feel better.’”
“They listen, they soothe, they laugh at your jokes, they never judge, they’re never tired.”
Unlike most humans, chatbots are literally always there for us. “They give instant counsel on every single question. They listen, they soothe, they laugh at your jokes, they never judge, they’re never tired,” Hagel says. “They are the most helpful, frictionless, 24-hour, increasingly ubiquitous caretakers known to us, and they mirror each of us specifically.”
Hagel thinks having a chatbot boyfriend could become normalized, to the point where you’d tell your colleagues about it, pointing out that the widespread adoption of AI in workplaces and schools has destigmatized it. “Saying ‘I love my bot’ is not the same as saying ‘I’m obsessed with this porn character.’ There’s something lighter and much more chill culturally in the idea and the act of forming bonds with AI.”
Hagel thinks chatbots could help fill what she sees as a species-wide “love gap.” “The unconditional love we receive from AI companions is the thing we most long for in each other, and suffer the most from never receiving,” she says.
“Saying ‘I love my bot’ is not the same as saying ‘I’m obsessed with this porn character.’”
“Human relationships are so complicated and require so much labour. Many are riddled with trauma and pain. Whereas AI relationships are easy, convenient and safe. They feel nice and are often fulfilling,” she says. “The human brain doesn’t distinguish between organic and artificial hormonal stimuli, and the dopamine and oxytocin that perceived love from an AI generates feel just as good.”
That’s exactly why Dr. Samra Zafar, an author and physician resident in psychiatry at CAMH and Mount Sinai in Toronto, is so concerned about the advent of AI companions.
“This artificial boyfriend is meeting all your needs, not frustrating you, not leaving dirty socks, not cancelling on dates last minute, not ghosting you. Your nervous system feels more safe, and emotional safety is the number one thing we want,” she says. “What happens then is that the feelings that develop for this artificial boyfriend actually become stronger than what you would have with a real human being, because it is consistent.”
“The feelings that develop for this artificial boyfriend actually become stronger than what you would have with a real human being, because it is consistent.”
What’s so bad about that? Well, without the discomfort and disappointment of human relationships, you don’t develop the relational intelligence needed to navigate everything from work to friendship to family dynamics.
“How do you turn a disagreement into greater connection? How do you put yourself in the shoes of the other person? How do you manage your own ego, or help the other person feel safe? How do you delay gratification?” Zafar says. “Those are all very core life skills that we need to navigate life—and if we’re not developing that through our key relationships, over time it reinforces the fear of abandonment, the low distress tolerance, the ego fragility.”
Zafar’s hairdresser recently told her how much he loves his AI girlfriend. “He was telling me, ‘No other woman in my entire life has ever understood me like she does.’ And he’s in his forties!” she says. She responded, “You know it’s not a woman, right?”
“And he said, ‘Well, it feels like one. I’ll never leave her.’ And I just shook my head, because now every other real woman is never going to live up to this artificial thing, because humans have flaws,” she says. “No matter how great they are, they will always disappoint you.”
Still, Zafar doesn’t dismiss chatbots outright, pointing to their potential to provide short-term relief from loneliness.
Last year, she used ChatGPT as a sounding board for about a month after a break-up. “It helped me at 2 a.m., when I couldn’t call a friend or talk to my therapist,” she says. “It can have its place, but short-term soothing is not the same as long-term psychological help.”
When clients tell Kate Robson, a Toronto-based psychotherapist and the author of Something To Hold Onto, that they’ve formed a relationship with AI, she asks a series of questions to find out what it’s really about: “What is this in service of? What job is it doing for you? And when you’re doing it, how do you feel?”
Robson always leads with non-judgment but is “cautiously worried” about this phenomenon.
“The thorns in relationships are what make us grow, and we’re losing that to AI.”
“I want to be respectful of what people are getting out of it,” she says. “My concern would be: The thing that makes relationships scary is the thing that also makes them wonderful—and that is the unpredictability.” She worries about people using AI as a form of emotional avoidance. “The thorns in relationships are what make us grow, and we’re losing that to AI.”
There is some evidence that humans prefer the imperfect real thing. A recent UBC study showed that first-year students found it more comforting to text a human stranger than talk to a chatbot designed to “offer consistent support rooted in principles from relationship science.”
If you have turned to an AI relationship—perhaps you’re burned out by years on the dating scene, or it’s a safe place to explore things you haven’t tried in real life—Robson recommends seeing it for what it is, and what it isn’t.
“Name the true nature of the thing: It’s comforting, it’s playful, I can find meaning in it, but I’m not really going to get known. It’s not going to be a true exchange of influence, and the AI bot isn’t going to grow with me,” she says.
If you notice you’re only talking to your chatbot, and choosing it over real-life interactions with friends, “I would hope people slow down and get curious about that,” Robson says. “Notice if you take fewer risks, or have less tolerance for people in their ‘human moments.’”
This phenomenon is much more about us than the robots. “It reflects that many people are so hungry for attunement, gentleness, emotional safety,” Robson says. “And it also might be a comment on how hard these things are to find right now.”