OpenAI data estimates over 1 million people talk to ChatGPT about suicide weekly – ABC7 San Francisco

SAN FRANCISCO (KGO) — A new report from OpenAI revealed that an estimated 1.2 million people a week have conversations with ChatGPT that indicate they are considering taking their own lives.
The figure comes from OpenAI, ChatGPT's developer, which says around 0.15% of users active in a given week have conversations that include explicit indicators of potential suicidal planning.
Earlier in October, OpenAI CEO Sam Altman announced that ChatGPT has reached 800 million weekly active users.
TAKE ACTION: Suicide Prevention: Local resources for those in crisis
The company says its tools are trained to direct people to professional resources such as crisis helplines, but admits they fail to do so 9% of the time.
This report comes as both OpenAI and Character.ai face scrutiny, backlash and lawsuits over teen suicides.
In August 2025, a California family filed a lawsuit against OpenAI over the death of their 16-year-old son, Adam Raine, alleging its chatbot, ChatGPT, encouraged him to take his own life. It is the first legal action accusing OpenAI of wrongful death.
MORE: FTC investigating AI 'companion' chatbots amid growing concern about harm to kids
Similarly, families from New York to Colorado have filed lawsuits against Character Technologies, Inc., the Google Ventures-backed developer of Character.AI, alleging that their children died by suicide, attempted suicide or were otherwise harmed after interacting with the company's chatbots.
In the new report released on Monday, OpenAI says it recently updated ChatGPT's default model to better recognize and support people in moments of distress.
In the same week, Character.AI announced Wednesday that it is tightening its chatbot safety rules by completely removing the ability for users under 18 to engage in open-ended chat with AI on its platform.
It said it would wind down access gradually, starting with a two-hour daily limit that will decrease over time to zero.
MORE: Gov. Newsom vetoes bill to restrict kids' access to AI chatbots
This change is set to take effect no later than Nov. 25.
In their release, a note was directed to the company's under-18 users, saying, "To our users under 18: We understand that this is a significant change for you. We are deeply sorry that we have to eliminate a key feature of our platform… We do not take this step of removing open-ended Character chat lightly – but we do think that it's the right thing to do given the questions that have been raised about how teens do, and should, interact with this new technology."
OpenAI says its latest update, the new GPT-5 model, has reduced undesired answers in the self-harm and suicide category by 52% compared to GPT-4o (n=630).
"We've continued improving GPT-5's reliability in long conversations. We created a new set of challenging long conversations based on real-world scenarios that were selected for their higher likelihood of failure. We estimate that our latest models maintained over 95% reliability in longer conversations, improving in a particularly challenging setting," says the OpenAI report.
If you are experiencing a suicidal, substance use or other mental health crisis, please call or text the new three-digit code, 988. You will reach a trained crisis counselor for free, 24 hours a day, seven days a week. You can also go to 988lifeline.org.

