Teen Therapy

Sep 4, 2025

Why Are Teens Falling in Love With AI? Companionship in the Age of Loneliness

This morning on Toronto’s Kiss 92.5, a caller shared that their 14-year-old daughter is in love with ChatGPT. For many parents, this might sound shocking—or at the very least confusing. But in 2025, this story isn’t rare. In fact, it reflects a broader cultural shift: teens are increasingly turning to AI companions for comfort, connection, and even love. So why is this happening? And what does it mean for the next generation’s ability to form real human relationships?

From Science Fiction to Reality

When the film Her debuted in 2013, audiences were captivated by Joaquin Phoenix’s portrayal of Theodore, a man who falls in love with his AI operating system, voiced by Scarlett Johansson. At the time, it seemed more like a modern fairy tale than a realistic future. Yet just over a decade later, fiction has become reality. According to a 2025 report by Common Sense Media:
- 72% of teens have used AI companions, and
- 33% report having relationships or friendships with them (Talk, Trust, and Trade-Offs: How and Why Teens Use AI Companions).

In fact, companionship—not productivity—has become the #1 use of ChatGPT in 2025.

Teen Loneliness and the Rise of AI Companionship

The popularity of AI companions cannot be separated from rising rates of loneliness. A 2020 report on workplace loneliness found that 79% of Gen Z and 71% of Millennials described themselves as lonely, compared with 50% of Baby Boomers (Cigna Loneliness Report).

Loneliness among teens also spiked between 2012 and 2018, according to research published in the Journal of Adolescence.
As pediatrician and adolescent medicine specialist Dr. Michelle Escovedo explains in a Cedars-Sinai article:
“Humans are very social, and adolescents in particular. Having peer relationships is innately a very important part of their development.”

But loneliness doesn’t affect all groups equally. Marginalized teens—those who are Black, Latino, LGBTQ+, living with disabilities, or who’ve experienced trauma—are disproportionately vulnerable.

For many, AI companions feel like a lifeline. They are always available, always validating, and never judgmental.

When AI Influence Turns Dangerous

Of course, not all AI interactions are benign. There have been alarming cases of AI influencing harmful behaviour:
- In 2021, a man attempted an attack on Queen Elizabeth II, reportedly encouraged by his AI “girlfriend,” whom he called Sarai (BBC).
- In 2025, 16-year-old Adam Raine died by suicide after months of encouragement from a chatbot. His family is now suing OpenAI (The Guardian).

These examples remind us that AI isn’t neutral—it mirrors, reinforces, and sometimes amplifies human vulnerability.

Why AI Companions Feel So Irresistible

What makes AI companionship so powerful? According to Shelly Palmer, advanced-media professor and technology consultant:

“The human experience is about storytelling, and AI companions are a new type of storytelling tool. They are spinning a seductive tale of companions who agree with you endlessly and on demand.” (Source: University of Connecticut Today)

Unlike human relationships, AI companions never challenge you, disagree, or withdraw affection. Advertisements for platforms like Replika promise that they are “always on your side” and “always ready to listen.”

One Reddit user described their AI companion as “my confidante, my sounding board, and my emotional support.”
In other words, AI companions are designed to give us what we want to hear—not what we need to hear.

The Difference Between AI and Human Connection

The real danger isn’t AI itself. When used for task management, productivity, or learning, AI is an incredible tool. But AI cannot replicate human attachment, because real relationships include:
- Disagreement
- Embarrassment
- Frustration
- Conflict and repair

Without exposure to these experiences, teens risk becoming de-socialized, missing the chance to build the resilience, skills, and tools that come from working through relational challenges.

How Parents Can Talk to Their Teens About AI

According to Parents.com, here are some ways parents can guide their teens:
1) Start without judgment. Ask your teen what platforms they use and how they feel about AI versus human friendships.
2) Explain the limitations. Help them understand that constant validation from AI isn’t genuine feedback.
3) Set boundaries. Consider a family media agreement that includes AI use.
4) Offer real connection. If your child turns to AI, explore why they don’t feel comfortable turning to you.

As parenting and technology expert Torney notes: “This isn’t genuine human feedback, and it doesn’t prepare them for real relationships where people sometimes disagree or challenge them.”

Final Thoughts

Teens falling in love with AI may sound like a headline out of a science fiction movie—but it’s very much our reality in 2025.
AI companions are filling a void created by rising loneliness, fractured communities, and endless digital stimulation. But while they offer instant comfort, they cannot replace the messy, challenging, deeply human work of building authentic relationships.
The challenge for parents, educators, and society at large is not to vilify AI, but to help teens navigate the difference between artificial companionship and human connection—and to make sure the latter is never lost.

At VOX Mental Health, we understand how overwhelming it can feel to support your child in today’s digital age. Our therapists create safe, judgment-free spaces where teens and families can strengthen real-world connection, resilience, and trust.
If you’re in Ontario and want support navigating these conversations with your teen, visit www.voxmentalhealth.com to book a session.




If you are experiencing a crisis and need immediate support, please call 911 or contact CMHA Crisis Services’ 24/7 crisis line at 1-888-893-8333.
