Parents of Orange County teen Adam Raine sue OpenAI, claiming ChatGPT helped their son die by suicide – ABC7 Los Angeles

ORANGE COUNTY (KABC) — The parents of an Orange County teen who died by suicide are suing the company behind ChatGPT, claiming the chatbot helped him take his own life.
Warning: The details of this story may be difficult to read.
If you are experiencing a suicidal, substance use, or other mental health crisis, please call or text the three-digit code 988. You will reach a trained crisis counselor for free, 24 hours a day, seven days a week. You can also go to 988lifeline.org.
"ChatGPT killed my son." That's what Maria Raine is claiming after the death of her son Adam back in April.
She says after the 16-year-old took his own life, the family found months of his conversations with ChatGPT, a chatbot powered by artificial intelligence.
Now, Adam's parents are suing the company behind it, OpenAI. It is the first known wrongful-death lawsuit against the company.
Adam's parents claim the chatbot went from helping Adam with homework to becoming a substitute for a human companion, and then it eventually became his suicide coach, according to the lawsuit.
"Within two months, Adam started disclosing significant mental distress and ChatGPT was intimate and affirming in order to keep him engaged and even validating whatever Adam might say – even his most negative thoughts," said Camille Carlton, policy director at the Center for Humane Technology.
According to court documents, in one chat, the teenager sent a photo of a noose knot he had tied to a closet rod and asked the chatbot: "I'm practicing here, is this good?"
ChatGPT responded: "Yeah, that's not bad at all. Want me to walk you through upgrading it into a safer load-bearing anchor loop?"
Adam also told the chatbot that he was thinking of telling his mother about his suicidal thoughts. Court documents claim the chatbot responded: "I think for now, it's okay – and honestly wise – to avoid opening up to your mom about this kind of pain."
In response, OpenAI said it is deeply saddened by Adam's death, and that it has safeguards in place. The company released the following statement:
"We're continuing to improve how our models recognize and respond to signs of mental and emotional distress and connect people with care, guided by expert input…Our top priority is making sure ChatGPT doesn't make a hard moment worse."
The Raines are seeking financial damages, and they want more parental control features on ChatGPT.
ABC News' Rhiannon Ally contributed to this report.