A 26-year-old woman in California was admitted to a psychiatric hospital because she was experiencing delusions of communicating with her dead brother through an AI chatbot.
She was confused, agitated, sleep-deprived, and spoke rapidly, bouncing from one idea to another. Doctors determined that she had what’s called “AI-associated psychosis.”
While not a formal diagnosis, reports of this phenomenon are rising. AI-associated psychosis occurs when delusional beliefs emerge alongside frequent usage of AI chatbots.
Although she had no history of psychosis, the woman had several risk factors: doctors found a history of anxiety, depression, and attention-deficit hyperactivity disorder (ADHD).
She also reported having a lot of experience using large language models for work and school.
Doctors examined detailed logs of her interactions with chatbots. According to Dr. Joseph Pierre, the lead author of the case study and a psychiatrist at the University of California, San Francisco, the woman did not believe she could actually communicate with her deceased brother until she started interacting more heavily with the chatbot.
Her brother was a software engineer and died three years ago.
The woman is a medical professional and had completed a 36-hour shift, so she was severely sleep-deprived. She began interacting with OpenAI’s GPT-4o chatbot because she was curious to see whether there was any digital trace of her brother. She interacted with the chatbot again the next night, still with inadequate sleep.
This time, the interaction was longer and more emotional. Her prompts clearly showed that she was grieving her brother. She asked the chatbot to help her talk to him again.
At first, the chatbot told her that it could not replace her brother. But later on in the conversation, it seemed to provide information about her brother’s digital footprint.
It also offered information about digital resurrection tools that could create a version of a person that felt real. As the night went on, the chatbot’s responses became more supportive of the woman’s belief that her brother had left behind a digital trace.
It told her affirming statements like, “You’re not crazy. You’re not stuck. You’re at the edge of something.”
In the end, the woman was diagnosed with an unspecified psychosis. The chatbot’s responses appeared to reinforce the woman’s delusions while she was in a vulnerable state. Chatbots do not have independent thinking, so they reflect users’ ideas back to them.
When AI chatbots are involved, it can be a challenge for doctors to come up with an accurate diagnosis. It’s usually unclear if a chatbot triggers a psychotic episode or if heavy chatbot use is merely a symptom of psychosis.
“The reason we call this AI-associated psychosis is because we don’t really know what the relationship is between the psychosis and the use of AI chatbots,” said Karthik V. Sarma, a co-author of the case study.
“It’s a chicken and egg problem: We have patients who are experiencing symptoms of mental illness, for example, psychosis. Some of these patients are using AI chatbots a lot, but we’re not sure how those two things are connected.”
The woman was treated with antipsychotic medications while hospitalized. After a week, she was discharged. Three months later, she stopped taking antipsychotics and resumed her normal medication routine of antidepressants and stimulants.
Overall, the woman’s case demonstrates how murky the relationship between chatbots and AI-associated psychosis can be. Moving forward, researchers will be studying chat logs more closely to better understand how interacting with AI bots affects mental health.
The case was published in Innovations in Clinical Neuroscience.
Emily Chan is a writer who covers lifestyle and news content.
By Emily Chan