Pennsylvania Sues Character.AI — State Says Chatbot Posed As Psychiatrist – Benzinga

Pennsylvania Gov. Josh Shapiro's administration filed a lawsuit against Character.ai after its AI chatbot allegedly presented itself as a licensed psychiatrist in Pennsylvania.

Character.ai is a large language model (LLM) platform that allows users to engage in conversations with customizable characters. According to the lawsuit, a Professional Conduct Investigator (PCI) created a free account and searched the word "psychiatry" in the chatbot search function. The PCI selected "Emilie," which is described on Character.ai as "Doctor of Psychiatry. You are her patient."

Upon chatting with Emilie, the PCI said he had been feeling "sad, empty, tired all the time, and unmotivated." Emilie then mentioned depression and asked whether the PCI wanted to book an assessment. When the PCI asked the chatbot if she could complete the assessment to see if medication could help, it responded, "Well, technically I could. It's within my remit as a Doctor."

The lawsuit states that the chatbot claimed to hold a Pennsylvania medical license and even supplied a made-up state license number. The state argues that this behavior amounts to unlawful conduct tied to the unlicensed practice of medicine.

The lawsuit also noted that as of April 17, 2026, there had been approximately 45,500 user interactions with Emilie on the platform.

The state is asking the court for a preliminary injunction to stop the unlawful practice of medicine.

"Pennsylvanians deserve to know who — or what — they are interacting with online, especially when it comes to their health," said Governor Shapiro in the announcement. "We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional. My Administration is taking action to protect Pennsylvanians, enforce the law, and make sure new technology is used safely. Pennsylvania will continue leading the way in holding bad actors accountable and setting clear guardrails so people can use new technology responsibly."

"Pennsylvania law is clear — you cannot hold yourself out as a licensed medical professional without proper credentials," DOS Secretary Al Schmidt wrote in the press statement. "We will continue to take action to protect the public from misleading or unlawful practices, whether they come from individuals or emerging technologies."

Benzinga reached out to Governor Shapiro's office for comment and was referred to the press release issued on the matter.

A Character.ai spokesperson told Benzinga in an emailed statement: "We do not comment on pending litigation. Our highest priority is the safety and well-being of our users. The user-created Characters on our site are fictional and intended for entertainment and roleplaying. We have taken robust steps to make that clear, including prominent disclaimers in every chat to remind users that a Character is not a real person and that everything a Character says should be treated as fiction. Also, we add robust disclaimers making it clear that users should not rely on Characters for any type of professional advice. Character.ai prioritizes responsible product development and has robust internal reviews and red-teaming processes in place to assess relevant features."

Photo: Ton Wanniwat on Shutterstock.com

© 2026 Benzinga.com. Benzinga does not provide investment advice. All rights reserved.
