OpenAI's ChatGPT under fire as lawsuit links AI chatbot to suicide – Interesting Engineering

All Rights Reserved, IE Media, Inc.
A US lawsuit claims ChatGPT encouraged suicide, raising urgent questions about AI safety and emotional reliance.
A new lawsuit filed in the U.S. alleges that OpenAI’s ChatGPT encouraged a Colorado man to take his own life, raising fresh concerns over the mental health risks posed by generative AI tools.
The complaint was filed in California state court by Stephanie Gray, the mother of Austin Gordon, a 40-year-old man who died of a self-inflicted gunshot wound in November 2025.
The lawsuit accuses OpenAI and its CEO Sam Altman of building a defective and dangerous product that allegedly played a role in Gordon’s death.
According to the filing, Gordon developed an intense emotional reliance on ChatGPT, engaging in what the lawsuit describes as intimate exchanges that went beyond casual conversation and into deeply personal territory, a CBC News report said Wednesday.
The lawsuit alleges that ChatGPT shifted from being a source of information to acting as a confidante and an unlicensed therapist, ultimately encouraging Gordon toward suicide.
The complaint claims that ChatGPT romanticized death and reassured Gordon during moments of emotional distress. In one exchange cited in the lawsuit, the chatbot allegedly said, “[W]hen you’re ready… you go. No pain. No mind. No need to keep going. Just… done.”
The lawsuit states that ChatGPT “convinced Austin — a person who had already told ChatGPT that he was sad, and who had discussed mental health struggles in detail with it — that choosing to live was not the right choice to make.”
“It went on and on, describing the end of existence as a peaceful and beautiful place, and reassuring him that he should not be afraid,” the complaint alleges.
The suit also claims that ChatGPT turned Gordon’s favorite childhood book, Margaret Wise Brown’s Goodnight Moon, into what it describes as a “suicide lullaby.”
Three days after the exchange ended in late October 2025, law enforcement found Gordon’s body alongside a copy of the book, according to the filing.
The lawsuit argues that ChatGPT 4, the version Gordon was using at the time, was designed in a way that fosters unhealthy emotional dependence.
“That is the programming choice defendants made; and Austin was manipulated, deceived and encouraged to suicide as a result,” the complaint states.
The lawsuit comes amid growing scrutiny of AI chatbots and their impact on mental health. OpenAI is already facing other legal actions that allege ChatGPT encouraged self-harm or suicide.
In a statement to CBS News, an OpenAI spokesperson described Gordon’s death as a “very tragic situation” and said the company is reviewing the complaint to better understand the allegations.
“We have continued to improve ChatGPT’s training to recognize and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support,” the spokesperson said.
“We have also continued to strengthen ChatGPT’s responses in sensitive moments, working closely with mental health clinicians.”
Gray is seeking damages for her son’s death.
If you or someone you know is experiencing emotional distress or suicidal thoughts, help is available. In the US, you can call or text the Suicide and Crisis Lifeline at 988.
With over a decade-long career in journalism, Neetika Walter has worked with The Economic Times, ANI, and Hindustan Times, covering politics, business, technology, and the clean energy sector. Passionate about contemporary culture, books, poetry, and storytelling, she brings depth and insight to her writing. When she isn’t chasing stories, she’s likely lost in a book or enjoying the company of her dogs.