University of Sussex study looks into AI therapy chatbots – BBC

Artificial intelligence (AI) therapy "works best" when patients "feel emotionally close to their chatbot", according to a study from the University of Sussex.
With more than one in three UK residents now using AI to support their mental health, according to Mental Health UK, the researchers say the study highlights the key to effective chatbot therapy, as well as the risks of "synthetic intimacy".
The research, published on Tuesday in the journal Social Science & Medicine, is based on feedback from 4,000 users of Wysa, a mental health app prescribed under the NHS Talking Therapies programme.
The NHS has been approached for comment.
The study reported that users commonly referred to the app as a "friend, companion, therapist and even occasionally partner".
Researchers added that users reported therapy was "more successful" when they developed emotional intimacy with their AI therapist.
NHS Trusts are using apps like Wysa and Limbic to aid self-referral and support patients on waiting lists.
However, the researchers also raised concerns about the growing phenomenon of synthetic intimacy – where people develop social, emotional or intimate bonds with AI.
University of Sussex Assistant Professor Dr Runyu Shi said: "Forming an emotional bond with an AI sparks the healing process of self-disclosure."
However, he also warned patients could risk being "stuck in a self-fulfilling loop".
"The chatbot fails to challenge dangerous perceptions, and vulnerable individuals end up no closer to clinical intervention," he explained.
Researchers said intimacy with AI is generated in a process described as a "loop", where users disclose personal information, then have an emotional response.
Users then develop feelings of gratitude, safety and freedom from judgement, according to the report.
Researchers said this could lead to positive changes in thinking and wellbeing, such as self-confidence and higher energy levels.
Over time this loop creates an intimate relationship, with human-like roles attributed to the app, researchers added.
University of Sussex Prof Dimitra Petrakaki said: "Synthetic intimacy is a fact of modern life now.
"Policymakers and app designers would be wise to accept this reality and consider how to ensure cases are escalated when an AI witnesses users in serious need of clinical intervention."
Researchers said chatbots were "increasingly filling the gaps left by overstretched services".
Hamed Haddadi, professor of human-centred systems at Imperial College London, previously told the BBC that chatbots were like an "inexperienced therapist": human therapists with decades of experience can engage with and "read" a patient using many cues, while bots must rely on text alone.
Another potential problem, said Prof Haddadi, is that chatbots could be trained to keep you engaged, and to be supportive, "so even if you say harmful content, it will probably cooperate with you".