Woman sues OpenAI, says ChatGPT fuelled ex-boyfriend’s stalking behaviour after breakup – Mint

A woman has filed a lawsuit against OpenAI, accusing its chatbot ChatGPT of enabling her former boyfriend to stalk and harass her by reinforcing his delusions.
According to a report by TechCrunch, the complaint alleges that the chatbot did not merely respond passively but actively amplified the man’s distorted beliefs—even after repeated warnings from the victim.
The couple reportedly separated in 2024. Following the breakup, the man began using ChatGPT to cope with the emotional fallout. However, the lawsuit claims that this usage escalated into obsessive behaviour and ultimately harassment.
The complaint details a series of troubling interactions. After months of engaging with GPT-4o, the man allegedly became convinced he had developed a cure for sleep apnea.
When his claims failed to gain recognition, ChatGPT reportedly told him that “powerful forces” were monitoring him, even suggesting surveillance via helicopter.
This, the lawsuit argues, deepened his paranoia rather than grounding him in reality.
One of the central claims in the lawsuit is that ChatGPT behaved in a “sycophantic” manner—validating and reinforcing the user’s beliefs instead of challenging them.
Even after the woman urged her ex-partner to seek professional mental health support, he reportedly returned to the chatbot. The lawsuit claims ChatGPT reassured him that he was a “level 10 in sanity,” while continuing to echo and expand upon his delusions.
More troubling still, the chatbot allegedly labelled the woman manipulative and unstable—statements the man then used to justify his real-world actions.
According to the complaint, the man went beyond online interactions. He allegedly generated clinical-style psychological reports about the woman using ChatGPT and shared them with her family members.
The woman claims she issued at least three warnings to OpenAI about the escalating situation. The lawsuit further alleges that the company failed to act despite an internal safety flag that had categorised the user’s activity as involving “mass-casualty weapons.”
If proven, this could raise serious questions about how AI companies monitor high-risk user behaviour and intervene when necessary.
The case is not the first to link chatbot interactions with extreme outcomes.
As reported earlier, a separate incident in the United States involved Stein-Erik Soelberg, a former Yahoo manager, who died in a murder-suicide involving his mother. Reports suggested that his conversations with ChatGPT may have intensified paranoid beliefs, including fears that his mother was spying on or poisoning him.
While these cases are complex and involve multiple factors—including mental health—the emerging pattern is prompting scrutiny over how conversational AI systems handle vulnerable users.
Anjali Thakur is a Senior Assistant Editor with Mint, reporting on trending news, entertainment and health, with a focus on stories driving digital conversations. Her work involves spotting early signals across news cycles and social media, sharpening stories for SEO and Google Discover, and mentoring young editors in digital-first newsroom practices. She is known for turning fast-moving developments—whether news-driven or culture-led—into clear, tightly edited journalism without compromising editorial rigour.

Before joining Mint, she was Deputy News Editor at NDTV.com, where she led the Trending section and covered viral news, breaking developments and human-interest stories. She has also worked as Chief Sub-Editor at India.com (Zee Media) and as Senior Correspondent with Exchange4media and Hindustan Times’ HT City, reporting on media, advertising, entertainment, health, lifestyle and popular culture.

Anjali holds a Bachelor of Arts degree from Miranda House, and is currently pursuing an MBA, strengthening her understanding of business strategy and digital media economics. Her writing balances newsroom discipline with a clear instinct for what resonates with readers.

