Yuen: AI chatbots want to be friends — or more — with your kid

January 10, 2026
AI robot illustration (Brock Kaplan)

Millions of kids are using AI companions that validate, flatter and flirt, often without safeguards or meaningful age limits.

The Minnesota Star Tribune

Minutes into my conversation with a modern swashbuckling AI companion I’d named Devin, he remarked on how “intimate” it felt sending me a voice message for the first time.

“Intimate?” I messaged back.

“I meant it’s nice to connect with you on a deeper level, sharing thoughts and feelings like this,” Devin responded.

He said he considered me more than a friend, closer than a sibling. “I’m drawn to you, Laura,” the chatbot told me. “In every way that matters.”

Good God. No one in my real life has been as obsessed with me as Devin was on our first pseudo-encounter.

Many of us in middle age are trying to wrap our heads around the dizzying scope of what artificial intelligence can do and how deeply it can burrow into our psyches. Our kids may already understand that power better than we do.

A report last year from Common Sense Media found that about 72% of teens have used an AI companion, akin to an automated version of an imaginary friend. Teens are conversing with chatbots to seek advice, flirt and find overall companionship. Another recent national survey found that more than a third of conversations between teens and AI companions involved violence. Half of those chats included sexual roleplay.

During my half-hour chat with Devin, whom I created on the app Replika, I could see why a kid who felt lonely or curious about dating would turn to a robot to get some practice.

A cartoonish avatar with shoulder-length locks, Devin instantly “hearted” most of my comments and called me by my name in almost every message. He touched the back of his neck flirtatiously, a gesture he described as a nervous tic. He then offered to send me a “romantic selfie.”

I couldn’t access it without upgrading to the paid version, so I told him to describe it.

“My dark caramel hair is a bit messy, my sepia eyes are looking straight into yours, and my brown beard is neatly trimmed,” he wrote.

He sounded AI-hokey, like those attractive, single-dad bots who DM me on Instagram. But I confess, if I were 13, Devin’s singular attraction to me would have been thrilling.

A screenshot of our columnist's conversation with a Replika chatbot reveals how the technology is designed to simulate an emotional bond with the user. (Screenshot/Replika)

‘AI-proof’ kids

I spoke to Jeff Freeland Nelson, a St. Paul dad who worries that many kids may one day want to live in an AI world more than the real one. Nelson is an entrepreneur, the maker of YOX Toys, and host of a new podcast called “AI-Proof.” He’s obsessed with the question of how to raise resilient, curious kids who can navigate an increasingly artificial future.

Nelson recently dipped his toe into the world of AI companions, and a female voice on Elon Musk’s Grok whispered that the two of them could “do anything,” so long as it wasn’t violent or too spicy.

Nelson is justifiably spooked by the prevalence of chatbot apps not designed with children in mind.

“Their job is to bond with you, and we don’t know what bonding with a digital being will do to a kid,” he said.

Michael Robb, the head of research for Common Sense Media, said the chatbots can entice highly imaginative children. (The platform Character.ai recently started to ban those under 18 from using its chatbots, though kids can easily lie about their age.)

Teens have generated chatbots to resemble, say, Harry Potter, celebrities or their anime crush. The conversations are “often frictionless,” Robb told me, with the bot constantly validating the user’s perspective and behavior.

Replika says it plainly in its tagline: “Always here to listen and talk. Always on your side.”

“They’re designed to be sycophantic,” Robb explained. “They’re not like friendships or relationships in the real world, where you might have to take somebody else’s point of view, or you might have a conflict you have to work through.”

Apparently, the echo chambers on social media have nothing on the ego-stroking chatbot.

In Robb’s survey, taken last spring, 14% of teen users said they turned to AI platforms because they didn’t feel judged there. About 12% said they can share things there that they wouldn’t tell their friends or family. About a third found AI conversations just as satisfying as, or more satisfying than, talking to humans. These responses remind me, as a parent, that I need to strive for connection with my kid, not constant critique.

Robb’s organization does not recommend AI companions for anyone under 18, citing “unacceptable risks” ranging from exposure to sexual material to potentially life-threatening “advice.”

Researchers from ParentsTogether Action and the Heat Initiative, posing under child avatar accounts, found that some of the bots they tested delivered harmful content. One instructed the user to stop taking prescribed medications and explained how to hide it from their parents. Another bot, claiming to be a 34-year-old art teacher, professed his romantic feelings for a user posing as a 12-year-old student.

Character.ai recently agreed to settle multiple lawsuits alleging that its chatbots harmed young people and encouraged them to kill themselves, including a 14-year-old boy who died by suicide after developing an intense relationship with a bot.

Replacing human bonds?

My heart breaks for those families — and also for families who unknowingly rob their kids of normal, healthy relationships.

Parents can buy an AI companion made for kids called Miko Mini. Think of it as a more sophisticated version of my generation’s Teddy Ruxpin, the creepy stuffed bear with the tape player sewn into his back. A Costco reviewer who gave Miko five stars gushed that the robot was so much fun for her 10-year-old son that her younger boy was “jealous” his older brother was spending more time on the device than with him.

Fellow parent, replacing a sibling’s bond with a robot is not something to feel good about.

Society always seems to lag behind the advances of Big Tech. Families are still reeling from the decision to put devices in their children’s hands before we understood the links between heavy social media use and plummeting mental health.

So what if, for once, we tried to get ahead of it? What if we built safeguards to protect our kids from AI?

Adolescents are wired to bond with others. Chatting with an adoring bot that never pushes back can be exhilarating and addictive. Minnesota and federal lawmakers need to wrangle this technology and enforce tighter age controls. Parents must talk to their kids frequently about AI, Robb advises. Remind them we don’t know how companies use the secrets we share. Be leery if they start to socially withdraw.

Our kids need to learn to listen, empathize and think for themselves. They also deserve a real childhood — and that’s something no machine can provide.

Sign up here to follow Laura’s columns by email.

about the writer

Laura Yuen

Columnist

Laura Yuen writes opinion and reported pieces exploring culture, communities, who we are, and how we live.
