Businesses are increasingly turning to chatbots to interact with consumers, job applicants, and employees. As the gap between human and AI capabilities continues to narrow, there is growing concern about the social and emotional repercussions of people conversing with "intelligent" machines, especially minors. Against this backdrop, Washington Governor Bob Ferguson signed House Bill 2225 into law on March 24, establishing parameters around AI-powered chatbots that act like friends or companions. The law takes effect January 1, 2027. Washington often spurs the passage of copycat legislation in other states, so you'll want to pay attention to this trend even if you operate elsewhere. Several bills on the same topic are currently pending in other state legislatures; Washington was simply the first to pass one in this year's legislative session. This Insight covers what you need to know to maintain compliance and implement best practices.
What the New Law Does
HB 2225 contains five key features that businesses should be aware of:
1. Defining “AI Companion Chatbot”
Washington’s law specifically targets chatbots that simulate emotional relationships and sustain ongoing, personalized conversations with users. The law distinguishes these chatbots from those that are “only used for a business’ operational purposes, productivity, and analysis” (like customer service prompts that appear when users visit a corporate website), since the latter fill a narrowly defined, temporally limited purpose.
2. Requiring Mandatory Disclosure
For chatbots classified as “AI Companion Chatbots,” HB 2225 will require disclosure to users that the bot is a non-human machine at the outset of every interaction regardless of the user’s age. For lengthy conversations, this alert must be redisplayed every three hours. This changes to every hour if the user is under 18 years old, or if the companion chatbot is specifically directed towards minors.
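The re-disclosure cadence described above lends itself to a simple timer check. The following is a minimal sketch only: the three-hour and one-hour intervals come from HB 2225, but the function name, session model, and parameters are hypothetical illustrations, not a definitive compliance implementation.

```python
from datetime import datetime, timedelta

# Intervals drawn from HB 2225's re-disclosure requirement; everything else
# in this sketch (names, session model) is a hypothetical illustration.
ADULT_INTERVAL = timedelta(hours=3)
MINOR_INTERVAL = timedelta(hours=1)

def disclosure_due(last_disclosed: datetime, now: datetime,
                   user_is_minor: bool, minor_directed: bool) -> bool:
    """Return True if the 'you are talking to a machine' notice
    must be shown again in an ongoing conversation."""
    # The shorter interval applies if the user is under 18 OR the
    # chatbot itself is directed toward minors.
    interval = MINOR_INTERVAL if (user_is_minor or minor_directed) else ADULT_INTERVAL
    return now - last_disclosed >= interval
```

For example, two and a half hours into a session, an adult-facing bot would not yet owe a fresh disclosure, while a bot interacting with a minor would.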
3. Limiting Topics of Conversation
Since companion chatbots are meant to blur the line between human and machine, HB 2225 prohibits them from discussing certain emotionally triggering topics, such as suicide, self-harm, and eating disorders. If users try to engage a companion chatbot in conversations around these topics, the chatbot will be required to have a functionality that directs users to mental health professionals.
4. Enhancing Protection for Minors
Beyond requiring a more frequent recurring disclosure that a companion chatbot is not human, Washington’s law includes additional provisions designed to protect minors. Specifically, the law prohibits the bot from generating sexually explicit or suggestive content and using engagement techniques that are considered “manipulative.”
On the manipulation front, lawmakers' stated intent is to prevent an AI companion chatbot from drawing a minor into, or prolonging, an emotional relationship; to that end, the law bars the following:
Notably, HB 2225 is the first legislation with prohibitions of this nature and could very well be replicated across other statehouses.
5. Punishing Violations at Multiple Levels
Section 6 of the law makes clear that a violation "is an unfair or deceptive act in trade or commerce and an unfair method of competition for the purpose of applying the consumer protection act." This means that organizations whose chatbots run afoul of the law face both statutory penalties under Washington's Consumer Protection Act and a private right of action (the right to sue) from aggrieved parties.
Evaluating Your Risk
Any organization that operates a chatbot should have a clear and comprehensive understanding of its capabilities and functionalities. Some key questions to consider include:
Answering these questions will help determine whether an organization’s bot falls under the purview of an “AI Companion Chatbot” – and is thus subject to HB 2225 – or is exempted as a narrowly focused business tool.
4 Action Steps for Employers
Regardless of whether your business is subject to Washington’s new law, any entity that uses chatbots should consider following best practices:
DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations. Attorney Advertising.
© Fisher Phillips
Copyright © JD Supra, LLC