SPOKANE — Washington state lawmakers are advancing legislation to regulate artificial intelligence companion chatbots after research revealed that teens are turning to them for emotional support and friendship.
Data from Common Sense Media shows that 72% of teens have interacted with AI companions such as ChatGPT or Snapchat’s My AI. Of those users, 33% use AI companions for social interaction and relationships.
Mental health experts warn these chatbots aren’t equipped to handle those needs, despite their sophisticated responses.
“AI is not human,” said Tegan Brindley, a therapist at Renewed Stories Counseling. “It is really good at mimicking empathy and mimicking emotion. But it’s not actually giving off emotion or empathy.”
That ability to mimic human emotion makes AI both accessible and dangerous, experts say. Over the last year, there has been a devastating rise in stories of teens talking with AI chatbots shortly before dying by suicide.
“It is a really scary place,” said Chauntelle Lieske, executive director of NAMI Spokane, an organization for people living with mental health conditions and their loved ones.
Lieske warns that regulations on AI companions are critical.
“Without having regulation on AI, we have no idea what is coming out. Where is it gathering this information? What kind of things are going to be said on the other side?” Lieske said.
Lawmakers in Olympia are pushing forward two bills they hope will help. Governor Bob Ferguson says they would require AI companions to establish safeguards to detect and respond to self-harm or suicidal thoughts.
Both Brindley and Lieske call this an important step toward getting teens safe and effective help, but say it also highlights how many teens want to talk about mental health.
“I hope that it’s more of a wake up call to say, like, why are our teenagers going to AI? Why don’t they have access to something more?” Brindley said.
Violations of the legislation would be enforced under Washington's Consumer Protection Act. If passed, the law would take effect in January 2027.
You can read HB 2225 here.
You can read SB 5984 here.
COPYRIGHT 2025 BY KXLY. ALL RIGHTS RESERVED. THIS MATERIAL MAY NOT BE PUBLISHED, BROADCAST, REWRITTEN OR REDISTRIBUTED.