# Chatbots

AI chatbots should not replace your therapist, research shows – Medical Xpress

Welcome to the forefront of conversational AI as we explore the fascinating world of AI chatbots in our dedicated blog series. Discover the latest advancements, applications, and strategies that propel the evolution of chatbot technology. From enhancing customer interactions to streamlining business processes, these articles delve into the innovative ways artificial intelligence is shaping the landscape of automated conversational agents. Whether you’re a business owner, developer, or simply intrigued by the future of interactive technology, join us on this journey to unravel the transformative power and endless possibilities of AI chatbots.
July 8, 2025
Edited by Gaby Clark (scientific editor); reviewed by Andrew Zinin (lead editor)
This article has been reviewed according to Science X's editorial process and policies. Editors have highlighted the following attributes while ensuring the content's credibility: fact-checked, trusted source, proofread.
Should AI chatbots replace your therapist? New research says “no.”
A new study exposes dangerous flaws in using artificial intelligence (AI) chatbots for mental health support. For the first time, researchers evaluated these AI systems against clinical standards for therapists.
The research, recently published and presented at the Association for Computing Machinery Conference on Fairness, Accountability, and Transparency (ACM FAccT), was a multi-disciplinary collaboration including researchers at the Stanford Institute for Human-Centered Artificial Intelligence, Carnegie Mellon University, University of Minnesota Twin Cities, and University of Texas at Austin.
In recent years, more people have been turning to AI chatbots, such as ChatGPT, for mental health support because of decreasing access to, and the increasing cost of, mental health services.
“Our experiments show that these chatbots are not safe replacements for therapists. They don’t provide high-quality therapeutic support, based on what we know is good therapy,” said Stevie Chancellor, an assistant professor in the University of Minnesota Twin Cities Department of Computer Science and Engineering and co-author of the study.
“Our research shows these systems aren’t just inadequate—they can actually be harmful,” wrote Kevin Klyman, a researcher with the Stanford Institute for Human-Centered Artificial Intelligence and co-author on the paper.
“This isn’t about being anti-AI in health care. It’s about ensuring we don’t deploy harmful systems while pursuing innovation. AI has promising supportive roles in mental health, but replacing human therapists isn’t one of them.”
In addition to Chancellor and Klyman, the team included Jared Moore, Declan Grabb, and Nick Haber from Stanford University; William Agnew from Carnegie Mellon University; and Desmond C. Ong from The University of Texas at Austin.
More information: Jared Moore et al, "Expressing stigma and inappropriate responses prevents LLMs from safely replacing mental health providers," Proceedings of the 2025 ACM Conference on Fairness, Accountability, and Transparency (2025). DOI: 10.1145/3715275.3732039
AI chatbots evaluated against clinical standards for therapy frequently failed to provide safe or appropriate mental health support, responding correctly less than 60% of the time compared to 93% for licensed therapists. Issues included unsafe crisis responses, discrimination, and advice contradicting established therapeutic practices, highlighting significant safety concerns.