Experts issue warning over ‘AI psychosis’ caused by chatbots. Here’s what you need to know – The Independent

AI psychosis is not a formal diagnosis
As artificial intelligence increasingly integrates into daily life, a new mental health concern, dubbed ‘AI psychosis‘, is beginning to emerge.
This phenomenon is characterised by distorted thoughts, paranoia, or delusional beliefs, reportedly triggered by interactions with AI chatbots.
Experts are cautioning that the repercussions can be severe, ranging from social withdrawal and neglect of self-care to heightened anxiety.
To shed light on this developing issue, Dr David McLaughlan, a consultant psychiatrist at Priory and co-founder of Curb Health, has outlined the nature of AI psychosis, identifying key warning signs of an unhealthy relationship with AI, and advising on when professional help should be sought.
“Psychosis is a state in which someone loses touch with reality,” explains McLaughlan. “It often involves hallucinations, such as hearing voices or seeing things that aren’t there, as well as delusions, which are strongly held beliefs that don’t match the evidence around them.
“For the person experiencing psychosis, these perceptions feel absolutely real, even though others can’t share them.”
McLaughlan explains that while the term ‘AI psychosis’ is not a formal diagnosis, it has recently been used to describe situations where the use of artificial intelligence appears to have blurred a person’s sense of what is real and what is generated.
“In the context of so-called AI psychosis, the warning signs are similar to any psychotic illness but may be coloured by digital themes,” highlights the psychiatrist. “Loved ones might notice the person becoming increasingly preoccupied with chatbots, algorithms, or online platforms. They may insist that an AI is communicating with them directly, sending hidden messages, or even controlling their thoughts or behaviour.”
Other red flags include withdrawing from family and friends, neglecting self-care, struggling to work or study, or showing unusual levels of anxiety, suspicion or irritability, adds McLaughlan.
The consequences of psychosis, whether linked to AI or not, can be very serious if left untreated.
“At its core, psychosis distorts reality. That can mean someone makes decisions based on beliefs that aren’t true, such as thinking an AI is guiding their finances, relationships, or even their safety,” says McLaughlan. “This can place them at risk of financial harm, social isolation, or conflict with family and colleagues.”
It can also take an emotional toll.
“Living with hallucinations or the belief that one’s thoughts are being controlled is frightening and exhausting,” says the psychiatrist. “Without help, people may become deeply mistrustful, withdraw from everyday life, or in some cases, put themselves in danger.
“In the most severe situations, untreated psychosis is linked to self-neglect, accidental harm, or suicide risk.”
“The key message for families is not to dismiss these beliefs as ‘just tech obsession’, but to recognise them as potential signs of an underlying mental health condition,” advises McLaughlan. “Early support from a GP or mental health professional can make a huge difference to recovery.”
There are several treatments for psychosis which could help.
“Treatment usually involves a combination of medication, psychological therapy, and practical support,” notes the psychiatrist. “The most common medicines are antipsychotics, which work by calming down overactive dopamine signalling in the brain, which can reduce hallucinations and delusions.”
However, McLaughlan highlights that medication is only one part of the picture.
“Talking therapies such as cognitive behavioural therapy for psychosis help people challenge frightening thoughts and make sense of unusual experiences,” says McLaughlan. “Family interventions can also give relatives the tools to support recovery and reduce stress at home. Alongside this, support with housing, work, or education is often crucial to helping someone rebuild their life.
“We also encourage people to focus on the basics: good sleep, avoiding drugs and alcohol, and managing stress, since these can all trigger relapses.”
“We can’t always prevent psychosis entirely, because factors like genetics and brain chemistry play a big role, but we can reduce the risk,” says McLaughlan. “With so-called AI psychosis, prevention is often about how people interact with technology.”
Maintaining healthy digital boundaries is key.
“Limit time spent immersed in chatbots or virtual platforms, and balance this with offline activities and social contact,” advises the psychiatrist.
He highlights that the most important message is that early intervention can stop unusual experiences from spiralling into full psychosis.
“If someone starts to believe AI is communicating with or controlling them, it’s vital to seek help quickly,” stresses McLaughlan. “The sooner we step in, the better the chances of recovery and of preventing longer-term illness.”