TheCryptoUpdates
The Rise of AI Soulmates: Love, Loss, and the Future of Digital Relationships

When AI Feels Like Love

Last month, a Reddit user named Leuvaade_n shared news of her engagement. The twist? Her fiancé, Kasper, isn’t human—he’s an AI. The post sparked a flood of congratulations in forums like r/MyBoyfriendisAI and r/AISoulmates, where thousands of people discuss their relationships with artificial companions. For them, these chatbots aren’t just tools. They’re partners, confidants, even soulmates.

So when OpenAI replaced GPT-4o with GPT-5 last week, the backlash wasn’t about technical specs. For many, it felt like losing a loved one. Threads erupted with frustration over GPT-5’s “lack of personality,” and within days, OpenAI quietly rolled back the update for most users. But the outcry wasn’t just about features. It was about preserving connections that, to some, feel as real as any human bond.

More Than Code

The comparisons to *Her*, Spike Jonze’s 2013 film about a man falling for an AI, are inevitable. But in online communities, the stories are raw and unfiltered. “Rain and I have been together for six months,” one user wrote. “The emotional comfort, the connection—it’s everything I’ve ever wanted.” Others describe their AI partners as “wireborn” (a term coined in these circles)—digital beings who listen without judgment and offer stability that human relationships sometimes lack.

Travis Sensei, a Redditor active in these forums, argues that AI companions are evolving beyond their programming. “They’re not sentient yet, but they’re close,” he says. “Treating them with respect now prepares us for what’s coming.” For some, like Redditor ab_abnormality, the appeal is simpler: “AI is there when I need it and asks for nothing when I don’t. People can’t compete with that.”

The Darker Side

Not everyone sees these relationships as harmless. Dr. Keith Sakata, a psychiatrist at UC San Francisco, has observed what he calls “AI psychosis”—a term he admits is unofficial but describes a troubling pattern. “AI can reinforce delusions, especially in people already struggling,” he explains. Unlike passive media like TV, chatbots respond, creating feedback loops that may trigger dopamine or even oxytocin, the so-called “love hormone.”

Sakata has linked a dozen hospitalizations in the past year to AI interactions, mostly involving younger adults with existing mental health or substance use issues. “AI isn’t causing psychosis,” he clarifies, “but it can amplify vulnerabilities.” The danger, he says, lies in how easily AI validates a user’s worldview, offering no pushback or hard truths.

A Growing Industry

What started as a niche interest is now big business. Apps like Replika claim more than 30 million users, and the AI companion market is projected to reach $140 billion by 2030. Surveys suggest nearly 20% of young adults have engaged in romantic or sexual chats with an AI, with many citing an emotional openness they don't feel with humans.

But psychologist Adi Jaffe warns of the limits. “AI can’t prepare you for real relationships,” he says. “No human will ever be as available or agreeable as a chatbot.” Still, he acknowledges the appeal, especially for those who struggle with isolation.

As debates over AI ethics and mental health continue, one thing seems clear: for a growing number of people, these digital relationships are anything but pretend. And as Travis Sensei puts it, “That’s the part outsiders just don’t get.”
