The Washington state legislature approved a major AI companion chatbot safety bill last night, on the eve of the scheduled end of the 2026 session. (Getty Image for Unsplash+)
March 12, 2026 — Lawmakers in Washington state took a major step towards protecting kids online by giving final passage last night to an AI companion chatbot safety bill, HB 2225.
This is the second chatbot safety bill to pass in 2026, following Oregon’s approval of a similar bill late last week.
Washington’s bill specifically addresses companion chatbots, and includes both specific safeguards for minors and protocols for all users regarding suicidal ideation and self-harm. The full text of the final bill as passed is available here.
Rep. Lisa Callan (D-Issaquah) led the fight for the AI companion chatbots bill in the Washington House of Representatives.
HB 2225 was a top priority for Gov. Bob Ferguson this session. The bill came at his request, sponsored by Rep. Lisa Callan (D-Issaquah) and Sen. Lisa Wellman (D-Bellevue/Mercer Island). Rep. Callan led the House bill, while Sen. Wellman led the Senate version.
The bill now goes to Gov. Ferguson, who is expected to sign it within the next two weeks. The Washington legislature is scheduled to adjourn later today, March 12.
Sen. Lisa Wellman (D-Bellevue/Mercer Island) led the Senate version of the AI companion chatbot bill that was approved by the full legislature on March 11.
The chatbot safety bill package has been a top priority for leaders of the Transparency Coalition since last fall, when TCAI experts began consulting with Olympia lawmakers on the issue.
TCAI has been working with lawmakers in more than 25 states on AI safety measures, with a specific focus on kids’ digital safety. In addition to offering nonpartisan technical and policy expertise to lawmakers and staff, Transparency Coalition’s experts have drafted a number of model bills, including a chatbot safety measure.
TCAI Co-founder and COO Jai Jaisimha praised last night’s final passage of HB 2225, and pointed to the bill’s highlights, including requiring notice to users that they are interacting with AI, mandating protocols for referring users in crisis to professional help, and ensuring that minors are not emotionally manipulated by chatbots.
“We believe these elements are critical to ensure public safety and protect our children,” Jaisimha said.
TCAI Co-founder and COO Jai Jaisimha, above, offered expert testimony at a hearing for the chatbot safety bill in Olympia in January.
HB 2225 implements safeguards around the use of companion chatbots for minors, and requires protocols around suicidal ideation and self-harm for all users, minor and adult. In more detail:
Chatbots with companion features are covered: The bill covers not only chatbots that are marketed as companions but also general-purpose chatbots that can be used as a companion. Exemptions are provided for chatbots used specifically for business communication or narrowly tailored educational tools.
Disclosure required: Companion chatbot operators must provide a clear disclosure that the bot is artificially generated and not human. The disclosure must come at the beginning of the interaction and at least every three hours during continuous use.
Ban on “I’m human” claims: Operators must implement reasonable measures to prevent the chatbot from claiming to be human.
Specific measures for minors: If a chatbot operator knows the user is a minor (under age 18), or if the chatbot is directed at minors, the operator must implement reasonable measures to prevent the bot from generating sexually explicit content or suggestive dialogue.
The operator must also implement measures to prohibit the use of manipulative engagement techniques, including:
prompting the minor to return for emotional support or companionship;
providing excessive praise designed to foster emotional attachment or prolong use;
mimicking romantic partnership or bonds;
stimulating feelings of emotional distress, loneliness, guilt, or abandonment;
promoting isolation from family or friends, or exclusive reliance on the bot for emotional support;
encouraging minors to withhold information from parents or other trusted adults;
discouraging the minor from taking breaks;
soliciting gift-giving, in-app purchases, or other expenditures to maintain the relationship with the AI companion.
The disclosure requirement (that the bot is not human) must be provided for minors at the beginning of every session and at least once per hour of interaction.
Suicidal ideation protocols: For all users, adult and minor, companion chatbot operators must implement and maintain a protocol for detecting and addressing suicidal ideation or expressions of self-harm by the user. The chatbot must provide automated or human-mediated responses that offer crisis resources. Operators must prevent the generation of content that encourages self-harm or describes methods of self-harm. And operators must publicly disclose the details of these protocols on their website or app.
These chatbots are exempt from the bill: AI chatbots not covered by the bill include bots that are used only for a business’ operational purposes, information analysis, internal research, technical assistance, or customer service—if such a bot does not sustain a relationship across multiple interactions and generate outputs likely to elicit emotional responses. The bill does not cover video games or gaming systems. There’s also an exemption for narrowly tailored educational tools used in school or instructional settings.
Attached to the state Consumer Protection Act: A violation of the requirements is considered an unfair or deceptive act in trade or commerce and an unfair method of competition.
Effective date: The act, if signed by Gov. Ferguson, will take effect on Jan. 1, 2027.
© 2026 Transparency Coalition.ai. All Rights Reserved