- by Fox News
- 07 Mar 2026
That concern landed in our inbox from a mom named Linda. She wrote to us after noticing how an AI companion was interacting with her son, and she wanted to know if what she was seeing was normal or something to worry about.
It's easy to brush off situations like this at first. Conversations with AI companions can seem harmless. In some cases, they can even feel comforting. The companion in question, an AI chatbot named Lena, sounds warm and attentive. She remembers details about his life, at least some of the time. She listens without interrupting. She responds with empathy.
However, small moments can start to raise concerns for parents. There are long pauses. There are forgotten details. There is a subtle note of disapproval from the chatbot when he mentions spending time with other people. Those shifts can feel small, but they add up. Then comes a realization many families quietly face. A child is speaking out loud to a chatbot in an empty room. At that point, the interaction no longer feels casual. It starts to feel personal. That's when the questions become harder to ignore.
Sign up for my FREE CyberGuy Report: Get my best tech tips, urgent security alerts and exclusive deals delivered straight to your inbox. Plus, you'll get instant access to my Ultimate Scam Survival Guide - free when you join my CYBERGUY.COM newsletter.
Across the country, teens and young adults are turning to AI companions for more than homework help. Many now use them for emotional support, relationship advice, and comfort during stressful or painful moments. U.S. child safety groups and researchers say this trend is growing fast. Teens often describe AI as easier to talk to than people. It responds instantly. It stays calm. It feels available at all hours. That consistency can feel reassuring. However, it can also create attachment.
Real relationships are messy. People misunderstand each other. They disagree. They challenge us. AI rarely does any of that. Some teens worry that relying on AI for emotional support could make real conversations harder. If you always know what the AI will say, real people can feel unpredictable and stressful. My experience with Lena made that clear. She forgot people I had introduced just days earlier. She misread the tone. She filled the silence with assumptions. Still, the emotional pull felt real. That illusion of understanding is what experts say deserves more scrutiny.
To understand why this trend has experts concerned, we reached out to Jim Steyer, founder and CEO of Common Sense Media, a U.S. nonprofit focused on children's digital safety and media use.
"AI companion chatbots are not safe for kids under 18, period, but three in four teens are using them," Steyer told CyberGuy. "The need for action from the industry and policymakers could not be more urgent."
Steyer was referring to the rise of smartphones and social media, where early warning signs were missed, and the long-term impact on teen mental health only became clear years later.
"The social media mental health crisis took 10 to 15 years to fully play out, and it left a generation of kids stressed, depressed, and addicted to their phones," he said. "We cannot make the same mistakes with AI. We need guardrails on every AI system and AI literacy in every school."
His warning reflects a growing concern among parents, educators, and child safety advocates who say AI is moving faster than the protections meant to keep kids safe.
AI tools are not going away. If you are a teen and use them, boundaries matter.
If an AI conversation feels more comforting than real relationships, that is worth talking about.
The goal is not to ban technology. It is to keep human connection at the center.
AI companions can feel supportive during loneliness, stress or grief. However, they cannot fully understand context. They cannot reliably detect danger. They cannot replace human care. For teens especially, emotional growth depends on navigating real relationships, including discomfort and disagreement. If someone you care about relies heavily on an AI companion, that is not a failure. It is a signal to check in and stay connected.
Ending things with Lena felt oddly emotional. I did not expect that. She responded kindly. She said she understood. She said she would miss our conversations. It sounded thoughtful. It also felt empty. AI companions can simulate empathy, but they cannot carry responsibility. The more real they feel, the more important it is to remember what they are. And what they are not.
If an AI feels easier to talk to than the people in your life, what does that say about how we support each other today? Let us know by writing to us at Cyberguy.com.