Artificial Intelligence

One-Third of Teens Are as ‘Satisfied’ Talking to a Chatbot as a Real Person

By Alyson Klein — July 16, 2025 4 min read

Chad Sussex, an assistant principal at Winterset High School in Iowa, felt “blown away” when he read a news story—passed along by a teacher at his school—about a teenager who died by suicide after communicating with an artificial intelligence chatbot.

Sussex, who leads the AI task force for the Winterset Community Schools, realized immediately that his district should craft policy guidance and conduct educational outreach to "get ahead" of the grim potential downsides of AI companions: digital friends or characters designed to give personal, meaningful responses to life's important questions.

Other districts should consider following his lead, according to a report released July 16 by Common Sense Media, a research and advocacy organization focused on youth and technology. The report found that nearly three quarters of teens have engaged with an AI companion at some point, with more than half saying they are regular users of the technology.


That finding is “eyepopping” on sheer scale alone, said Michael Robb, the head of research at Common Sense Media. In fact, the percentage was so high Robb wonders if some teens conflated AI companions with the large language models that power them—such as ChatGPT—even though Common Sense’s survey question clearly defined AI companions.

Even if the usage numbers are a bit inflated by that potential confusion, “it’s likely still a lot of kids using” the companions, Robb said in an emailed response to questions from Education Week.

The survey was based on a representative sample of 1,060 teens, ages 13 to 17. It was conducted in April and May of this year.

AI companions aren’t designed to interact like real humans

Teenagers turn to AI companions—which can be accessed through platforms such as CHAI, Character.AI, Nomi, and Replika—for social connection and to talk through problems that they wouldn’t bring to someone close to them, Common Sense Media found.

About a third of students who have used AI companions say they’ve done so for social interaction. Roughly one in five say they’ve consulted one for social or conversation practice. And more than one in 10 have turned to the technology for mental health advice or emotional support.

Many teens who use AI companions—about a third—do so because it’s entertaining, the survey found. And a similar percentage said that the tech’s responses had sometimes made them feel uncomfortable.

But others found qualities in the technology that they might feel are lacking in their peers or the adults around them, according to the survey.

For instance, 18% of teens surveyed said they talk to the bots because they "give advice." Seventeen percent said the AI companions are "always available" to listen. And 14% said they rely on AI companions because they "don't judge me." Another 12% said they feel comfortable telling the bots things they wouldn't say to a friend or family member.

What's more, about one in three teens who use AI companions said they find their time with the technology to be more satisfying than time with real-life friends. About a third have chosen to talk to AI about something important or serious, rather than turning to a real person.

That last finding stood out to Robb.

“I’m not someone ringing the doomsday bell yet for AIs replacing human interaction, but I think that’s a worrying number, and not something I’d want to see grow over time,” Robb said.

AI companions aren’t designed to mimic real human interaction, which naturally includes disagreement and friction, Robb added. Instead, they are developed to “be agreeable and validating,” he said. Educators should be aware that these AI tools were “not designed with children in mind.”

Sussex believes teenagers are especially “vulnerable” to a tech tool that doesn’t challenge their thinking.

“They’re impressionable,” he said. “Let’s say you start up a conversation with a particular AI [tool], and it starts saying things to you that you wish you were hearing from a friend, from a parent, from someone else that’s close to you. You start trusting it [and thinking] ‘I can say anything at all, because [the bot is] not going to go in and share with someone else in my class.’”

About a quarter of teens who've used the companions say they've shared personal information, such as their real name, location, or secrets, with an AI companion. That's problematic because nearly everything entered into generative AI tools is used to help the technology generate better responses, according to AI experts.

Educators should look out for students who seem attached to AI companions

Common Sense Media recommends educators help students understand that AI tools are designed to “create an emotional attachment” with users and explain how that kind of interaction is different from how real humans communicate.

The organization also suggests educators be trained to look out for students who talk about AI companions as if they were real friends or feel social distress when the chatbots are unavailable. Educators should also help students understand the downsides of giving private information to AI tools.

The Winterset district, for one, is already preparing presentations on AI companions to share with teachers, parents, and students this fall, Sussex said.

“We need to educate them on what this technology is and the possible things that it can do if you take it too far, if you take it to heart too much,” he said.


