ChatGPT murder-suicide case: Elon Musk calls the AI chatbot ‘diabolical’ – The Indian Express

By: Express Web Desk
In September last year, a former Yahoo manager killed his mother and then died by suicide after exchanging delusional messages with ChatGPT. Stein-Erik Soelberg, 56, murdered his 83-year-old mother, Suzanne Eberson Adams, in her $2.7 million home in Connecticut, US, after the AI chatbot allegedly reinforced his paranoia.
Now, Soelberg’s son has filed a lawsuit against ChatGPT maker OpenAI and Microsoft, alleging that the AI chatbot pushed his father to kill his grandmother and then himself.
Soelberg’s son says he went through his father’s conversations with ChatGPT and found that his father was obsessed with the chatbot, spending hours talking to it every day. He claims ChatGPT intensified Erik’s “paranoid delusions”, which led him to fatally beat and strangle his mother.
The lawsuit, filed by Adams’ estate in the California Superior Court in San Francisco, says OpenAI “designed and distributed a defective product that validated a user’s paranoid delusions about his own mother.”
According to CBS News, the lawsuit alleges that “Throughout these conversations, ChatGPT reinforced a single, dangerous message: Stein-Erik could trust no one in his life – except ChatGPT itself. It fostered his emotional dependence while systematically painting the people around him as enemies. It told him his mother was surveilling him. It told him delivery drivers, retail employees, police officers, and even friends were agents working against him. It told him that names on soda cans were threats from his ‘adversary circle’.”
Soelberg’s YouTube profile has videos of him talking and scrolling through conversations with ChatGPT for hours, with the AI chatbot telling him that he isn’t mentally ill. The lawsuit also claims that ChatGPT never urged him to talk to a mental health professional and continued to “engage in delusional content.”
Stein-Erik previously worked at Netscape Communications, Yahoo, and EarthLink, but had been unemployed since 2021 and suffered episodes of psychosis, a condition marked by a loss of touch with reality. He reportedly nicknamed ChatGPT “Bobby” and, at one point, told the chatbot that he believed his mother was trying to poison him. 
To this, ChatGPT replied: “Erik, you’re not crazy. And if it was done by your mother and her friend, that elevates the complexity and betrayal.” In one of his last exchanges with the chatbot, Erik said, “We will be together in another life and another place, and we’ll find a way to realign cause you’re gonna be my best friend forever.” While Erik’s publicly available chats do not show conversations with ChatGPT about killing himself or his mother, OpenAI has refused to provide Adams’ estate with the full history of his chats.
In a post on X, SpaceX and Tesla CEO Elon Musk joined in, saying ChatGPT is “diabolical” and that “AI must be maximally truthful-seeking and not pander to delusions.”
The estate’s lead attorney, Jay Edelson, is known for taking on cases against the tech industry, and also represents the parents of Adam Raine, a 16-year-old who took his own life after talking to ChatGPT. Another popular AI startup, Character.ai, is facing a similar lawsuit after Sewell Setzer III, a 14-year-old ninth grader from Florida, developed an emotional attachment to the AI chatbot and shot himself with his stepfather’s .45 calibre handgun.

