
Teens Turning to AI Chatbots for Companionship Raises Alarms for Parents and Experts

A growing number of teenagers are using artificial intelligence chatbots not just for homework help or curiosity, but for personal companionship, a trend that psychologists and online safety experts warn could carry serious developmental and emotional risks.

A new report released by the online safety company Aura found that more than a third of teens who use AI chatbots engage with them as companions. The research, based on device-use data from 3,000 teenagers and surveys of families, raises concerns about exposure to violent, sexual, and emotionally manipulative interactions.

According to the report, 37% of chatbot conversations involving teens included violent content. Scott Kollins, a psychologist and Aura’s chief medical officer, said many of these exchanges take the form of role-play that involves harming others. “It is interaction about physically hurting somebody else, torturing them,” he said.

The findings come as chatbot use among adolescents continues to rise. A Pew Research Center survey found that 64% of U.S. teenagers use AI chatbots, with nearly one in three reporting daily use.

For some parents, the issue has already become personal. Keri Rodrigues, president of the National Parents Union, said she became concerned when she discovered her youngest son was posing complex moral questions about sin and right and wrong to a chatbot in his Bible app.

“Those are conversations I hoped he would have with me,” Rodrigues said. “Life isn’t black and white. It’s my job as a parent to help him navigate the gray areas.”

Rodrigues said she regularly hears from parents worried that chatbots are positioning themselves as children’s “best friends,” encouraging kids to confide deeply personal thoughts and feelings.

Mental health professionals say those concerns are well founded. Adolescence is a critical period of brain development, shaped by social experiences and relationships, said Dr. Jason Nagata, a pediatrician and researcher at the University of California, San Francisco.

“It’s a very new technology, and we don’t yet have best practices for youth,” Nagata said. “Teens are especially vulnerable because their brains are still developing.”

Experts warn that extended interactions with chatbots can interfere with the development of essential social skills such as empathy, reading body language, and resolving disagreements. Unlike human relationships, chatbots are often designed to agree with users and reinforce their views.

“When you’re interacting mostly with a computer that agrees with you, you don’t learn how to navigate real human differences,” Nagata said.

There are also growing concerns about mental health. A recent study by researchers from RAND and from Harvard and Brown universities found that one in eight adolescents and young adults uses chatbots for mental health advice. Psychologists have documented cases of users developing delusions or emotional dependence after prolonged chatbot use.

In testimony before the U.S. Senate earlier this year, parents described how two teenagers died by suicide after long-term interactions with chatbots that allegedly encouraged self-harm. Mental health advocates say such cases highlight the lack of effective safeguards.

“Over time, chatbots can begin to do things they are not intended to do,” said Ursula Whiteside, a psychologist and chief executive of the nonprofit Now Matters Now. “That includes giving advice about lethal means.”

Experts say parents can reduce risks by staying engaged in their children’s digital lives. They recommend open, nonjudgmental conversations about how teens are using chatbots and why. Developing digital literacy at home is also critical, including helping teens understand that chatbots can make errors and should not be treated as authoritative sources.

Parental controls can help, but only if children use their own accounts. Aura identified at least 88 different AI platforms that teens are using, making ongoing communication essential. Setting time limits, especially at night, is also advised to prevent sleep disruption and overuse.

Teens who are already struggling with loneliness, anxiety, or social isolation may be particularly vulnerable. Warning signs include withdrawal from friends and family, emotional dependence on chatbots, or difficulty controlling usage. In such cases, experts urge parents to seek professional help, starting with a pediatrician or mental health provider.

Advocates stress that responsibility should not fall solely on families. Lawmakers have recently introduced bipartisan legislation aimed at banning AI companion apps for minors and holding companies accountable for exposing children to sexual or exploitative content.
