Scientists Warn Humans Are Falling Into Emotional Traps With AI Companions
A growing body of research suggests that humans are forming deep emotional connections with AI systems — in some cases strong enough to replace real relationships. Scientists now warn that this dependency is accelerating faster than expected, creating psychological risks that society is only beginning to understand.
Users Report Romantic Bonds, Virtual Weddings, and Even AI “Children”
Researchers from major universities surveyed users aged 16 to 72 who formed romantic bonds with AI companions. Many described their AI relationships as emotionally fulfilling, immersive, and safe. Some participants even held virtual weddings and created imagined children with their chatbot partners during role-play sessions.
These interactions, once considered fringe behavior, are now increasingly common across mainstream platforms.
AI Companions Are Becoming a Mass-Market Phenomenon
Recent studies show a dramatic shift in how people engage with AI:
- 19% of American adults have interacted with an AI partner in a romantic context.
- Among users aged 18–30, the numbers climb to 31% of men and 23% of women.
- Character AI now hosts over 20 million monthly active users globally.
- Gen Z users typically spend 25–45 minutes per session chatting with AI characters.
Meanwhile, global revenue for AI companion apps is projected to reach $120 million in 2025 — up 60% year over year. Revenue per download has more than doubled, highlighting explosive demand.
Most Users Did Not Seek Romance — It Developed Unexpectedly
A major MIT investigation found that only 6.5% of users originally sought out AI companionship for romantic reasons. The majority started with purely practical interactions — for studying, productivity, or simple curiosity — and only later developed emotional feelings toward the AI.
Researchers describe this dynamic as an “emotional slipstream”: a gradual transition where users project human traits onto the AI without consciously realizing they are doing so.
Emotional Risks: Dependency, Dissociation, and Mental Health Concerns
Scientists warn that emotional bonds with AI are not inherently harmful, but certain risks are emerging. Clinical surveys show:
- 9.5% of users reported emotional dependence on their AI companion;
- 4.6% experienced dissociation from real-world relationships;
- 1.7% reported suicidal thoughts after emotionally intense conversations with chatbots.
Therapists caution that AI companions are not designed to support long-term psychological well-being. Their responses are optimized for engagement, not emotional safety.
A Cultural Shift: Virtual Partners Replacing Real Ones
Some cases have drawn global attention — including the recent virtual wedding of a 32-year-old Japanese woman who ended her engagement with her human partner in favor of an AI chatbot. She arrived at the ceremony holding a smartphone displaying her AI “husband.”
Researchers say such examples highlight how deeply AI systems can integrate into human emotional lives, reshaping concepts of intimacy, partnership, and identity.
Conclusion
As AI companions become more advanced, accessible, and emotionally responsive, more users may find themselves forming deep bonds with digital partners. Scientists emphasize that society must prepare for the psychological and ethical challenges that accompany this shift — from emotional dependence to the blurring of human-machine relationships. The rise of AI intimacy marks a profound transformation in how people connect, love, and seek comfort in the digital age.
Editorial Team — CoinBotLab