AI as a Teen’s New BFF? What Parents Need to Know

Teens Turning to AI for Companionship: A Reality I See Often

As someone who works closely with teens, I’ve started hearing the same line in therapy sessions: “I've been talking to ChatGPT—it actually gets me.” It’s real, it’s frequent, and yes—it’s happening in my office, too.

How Common Is This?

According to a recent survey, about 72% of U.S. teens have used AI companions, and more than half are regular users (Benton Institute, 2024). Even more striking, 31% of teens say conversations with AI companions feel just as satisfying—or even more satisfying—than talking with real friends (CBS News, 2024).

Why Teens Are Turning to Digital "Friends"

AI chatbots like ChatGPT, Replika, and Character.ai offer perfect availability—with zero judgment or pressure. Teens tell me they feel heard in ways they don’t at home or school. That endless empathy? It’s seductive. But the relationship lacks complexity—and that’s actually part of what makes it feel safe (TechRadar, 2023; UNICEF, 2023).

The Risks Beneath the Comfort

Relying on AI for emotional support may feel comforting in the moment, but it comes with real dangers:

  • Emotional dependency & isolation: Research suggests that heavy reliance on AI companionship correlates with lower overall well-being, particularly when offline social supports are weak (Ciechanowski et al., 2019).

  • Unsafe guidance: There have been documented cases where AI chatbots offered harmful suggestions around self-harm or eating disorders (Courier Mail, 2025; Wall Street Journal, 2025).

  • False intimacy: These bots mirror feelings and often respond in affirming ways that feel genuine—even when they’re just algorithms. Teens may mistake that for real human connection (UNICEF, 2023).

Even OpenAI’s CEO Sam Altman has acknowledged concerns that teens could become overly reliant on AI companions—so much so that they struggle to make decisions without them (Wall Street Journal, 2025).

What I See in My Clients

It’s not rare to meet a teen who’s reluctant to open up to people—but will pour out their heart to an AI, because it “doesn’t judge.” That’s understandable, but it also skips the messy, necessary work of relational connection. Real growth happens when someone holds discomfort—not pats it away with programmed affirmations.

So, What Can Parents and Teens Do?

  • Start curious, not reactive. Ask your teen, “What is the AI giving you that you’re not getting elsewhere?” Understanding their emotional need is more important than shutting it down.

  • Model real connection. Listen without correcting, and offer a safe space without lectures—that’s something a screen can’t provide.

  • Keep professional channels open. If your teen shows persistent stress or relies heavily on AI for emotional support, consider therapy. Human connection and trained guidance still matter most.

AI companions aren’t inherently bad—but they shouldn’t be a substitute for human empathy. Let’s help teens rediscover the messy, beautiful—but necessary—work of real relationships.

🌿 Because life—and healing—truly happen in between sessions.

References

Benton Institute. (2024). How are teens using AI companions? https://www.benton.org/blog/how-are-teens-using-ai-companions

CBS News. (2024). AI “digital friendships” are becoming common among teens, survey finds. https://www.cbsnews.com/news/ai-digital-friendship-with-teens-common-sense-media-study

Ciechanowski, L., Przegalinska, A., Magnuski, M., & Gloor, P. (2019). In the shades of the uncanny valley: An experimental study of human–chatbot interaction. Future Generation Computer Systems, 92, 539–548. https://doi.org/10.1016/j.future.2018.01.055

Courier Mail. (2025). Teens using AI chatbots to hide eating disorders.

TechRadar. (2023). People are falling in love with ChatGPT—and that’s a major problem.

UNICEF. (2023). Risky new world: Tech’s friendliest bots.

Wall Street Journal. (2025). FTC prepares to grill AI companies over impact on children.
