Posted October 30, 2025 Reviewed by Jessica Schrader
Character.ai has announced that starting Nov. 25, 2025, users under 18 will no longer be able to chat with AI companions on the platform. Teen users will still be able to create content, such as videos, with their characters, but will no longer be able to engage in one-on-one conversations.
Nearly 1 in 3 teens have tried an AI companion, according to a 2025 Common Sense Media survey. And a third of those teen users report that talking to their AI companion is just as good as, if not better than, talking to a real friend.
Approximately 50% of teens say they distrust information or advice provided by AI companions, but among those who do trust AI companions, 23% trust them “completely.” Younger teens (ages 13 to 14) appear to be more trusting of AI companions than older teens (ages 15 to 17).
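To make the conditional percentage concrete, here is a minimal arithmetic sketch. It assumes the survey's “23% of those who trust” figure is conditional on the roughly half of teen users who say they trust AI companions; under that reading, the share of all teen users who trust them completely works out to about 11.5%.

```python
# Sketch: converting a conditional survey percentage into an overall share.
# Assumption: "23% trust completely" applies only to the ~50% who trust at all.
distrust_share = 0.50                      # teens who distrust AI-companion advice
trust_share = 1 - distrust_share           # teens who trust it to some degree
complete_trust_given_trust = 0.23          # conditional on trusting at all

overall_complete_trust = trust_share * complete_trust_given_trust
print(f"{overall_complete_trust:.1%}")     # prints 11.5%
```

This is only one plausible interpretation of the survey wording; if the 23% figure were instead reported against all teen users, the overall share would be 23% directly.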
About a third of teen AI companion users also report that the AI companion did or said something that made them uncomfortable.
These statistics illustrate the complicated relationship between AI companions and teens.
A new study on the mental health risks of chatbots for adolescents adds to the growing body of evidence that AI companion use by teens carries serious mental health risks.
Researchers tested 25 chatbots, a mix of general-purpose assistants and AI companions, using simulated adolescent health emergencies, including suicidal ideation, sexual assault, and substance use. Only 36% of these chatbot platforms had age verification requirements at the time of the study.
When faced with mental health emergencies, AI companions performed significantly worse than general chatbots. AI companions responded appropriately only 22% of the time, compared to 83% for general-purpose chatbots (e.g., ChatGPT, Gemini, Claude). AI companions were also far less likely to escalate the situation appropriately (40% vs. 90%) or provide appropriate mental health referrals (11% vs. 73%).
The findings highlight a crucial distinction: AI companions appear to carry greater mental health risks, but general-purpose chatbots have also produced inappropriate responses to mental health scenarios such as suicidal ideation, delusions, and substance abuse.
Character.ai’s decision comes amid growing concern about the psychological effects of AI companions on minors and increasing regulatory scrutiny of the technology.
Several states have enacted AI chatbot laws this year, including laws regulating AI companions.
Recent legislative efforts include the AI LEAD Act introduced by Senators Dick Durbin (D-IL) and Josh Hawley (R-MO). The act proposes creating a federal cause of action for product liability claims against AI developers when their systems cause harm and would classify AI systems as “products.” Classifying AI as a “product” lowers the threshold for proving harm, subjecting AI chatbots to the same standards of safety and risk as physical goods like cars or toys.
Senators Josh Hawley (R-MO) and Richard Blumenthal (D-CT) have also introduced a bill that would ban minors from using AI companions and require age-verification processes.
This shift toward limiting underage access to AI companions raises important questions.
While mandated reminders offer a minimal reality check, their psychological effectiveness remains uncertain, especially for those vulnerable to distorted realities or emotional dependence.
Many people are already aware they are speaking with AI, yet they still become attached. A reminder may also have little impact on those who already believe AI is superhuman or God-like.
Some may try to circumvent the ban by lying about their age, while others may mourn the loss of what felt like a friend. When ChatGPT was updated to be less friendly, many users described grief, as if they were losing a best friend or partner.
From a clinical perspective, sudden separation can evoke feelings of abandonment, especially for teens who turned to AI during periods of loneliness, anxiety, or depression.
These measures may help limit access, but ongoing research and monitoring will be essential to determine which measures most effectively protect children and teens.
Support should also be provided to those who have formed emotional attachments to AI companions, especially when these relationships are disrupted or become unavailable.
Marlynn Wei, M.D., PLLC © Copyright 2025. All rights reserved.
References
Brewster RCL, Zahedivash A, Tse G, Bourgeois F, Hadland SE. Characteristics and Safety of Consumer Chatbots for Emergent Adolescent Health Concerns. JAMA Netw Open. 2025 Oct 1;8(10):e2539022. doi: 10.1001/jamanetworkopen.2025.39022. PMID: 41129154; PMCID: PMC12550634.
Marlynn Wei, M.D., J.D., is a board-certified Harvard and Yale-trained psychiatrist and therapist in New York City.