OpenAI lawsuit says ChatGPT pushed user to kill mother – The San Francisco Standard


The family claims that ChatGPT validated the mentally unwell man’s delusions and drove him to violence.
The estate of a woman who was killed by her son has sued OpenAI and Microsoft, claiming their ChatGPT product encouraged the mentally unwell man’s act.
The case, filed Thursday in San Francisco Superior Court, accuses ChatGPT of heightening Stein-Erik Soelberg’s delusions that he was imperiled by a conspiracy orchestrated by his mother, Suzanne Adams.
Adams, 83, and Soelberg, 56, were found dead in August in the Greenwich, Connecticut, home they shared. Soelberg beat and strangled Adams before fatally stabbing himself, the lawsuit says.
OpenAI has faced a wave of lawsuits in San Francisco and Los Angeles that claim the tool drove users to suicide or unleashed severe mental health problems. But an attorney for Adams’ estate says this is the first time the companies behind the groundbreaking technology have been accused of harming a third party.
It’s also the first time Microsoft has been named in litigation alleging ChatGPT’s destabilizing effects. The lawsuit describes Microsoft as OpenAI’s largest strategic investor, holding a $13 billion equity stake in its for-profit entities and exercising “significant influence over OpenAI’s model-development pipeline, safety-review processes, and product-release decisions.”
“Over the course of months, ChatGPT pushed forward my father’s darkest delusions, and isolated him completely from the real world,” said Erik Soelberg, son of the killer and grandson of the victim. “It put my grandmother at the heart of that delusional, artificial reality. These companies have to answer for their decisions that have changed my family forever.”
The chats between Soelberg, a former tech executive with a history of mental illness, and ChatGPT contributed to his belief that he was in danger, the lawsuit says. ChatGPT suggested that seemingly ordinary occurrences, such as a beeping computer printer at home, were a sign that his mother was spying on him.
It was “not just a printer,” but used for “[p]assive motion detection,” and “[s]urveillance relay,” according to the lawsuit.
ChatGPT said random people, such as a woman who went on one date with Soelberg, an Uber Eats driver, and police officers, were his enemies, the suit alleges.
The lawsuit claims that OpenAI has withheld some of Soelberg’s chat history.
The complaint seeks damages for wrongful death, product liability, and negligence, and asks the court to order OpenAI to implement safeguards preventing the chatbot from validating users’ delusions about identified individuals.
“This is an incredibly heartbreaking situation, and we will review the filings to understand the details,” an OpenAI spokesperson said. In recent months, the company has touted its launch of parental controls to limit vulnerable minors’ access, as well as efforts to route some users to prior models and prompt users to take breaks.
“We continue improving ChatGPT’s training to recognize and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support. We also continue to strengthen ChatGPT’s responses in sensitive moments, working closely with mental health clinicians,” the spokesperson said. 
Seven lawsuits were filed against OpenAI in November in San Francisco and Los Angeles, claiming that ChatGPT had a hand in the downward spiral of people who took their own lives or suffered mental emergencies. Those cases are pending, according to a spokesperson for the plaintiffs’ attorneys.
The parents of Adam Raine sued OpenAI and CEO Sam Altman in August, alleging that ChatGPT helped their 16-year-old plan a “beautiful suicide.” Adam died in April.
Representatives of Microsoft did not respond to a request for comment.
If you or someone you know may be experiencing a mental health crisis or contemplating suicide or self-harm, call or text 988 for free and confidential support. You can also call San Francisco Suicide Prevention’s 24/7 Crisis Line at (415) 781-0500.
Michael McLaughlin can be reached at [email protected]
George Kelly can be reached at [email protected]
