Use caution with AI therapy, experts say – Global News

As artificial intelligence chatbots like ChatGPT have grown in popularity, some are turning to them for therapeutic purposes.
“It was probably about a year and a half ago that I started hearing rumblings of people using things like ChatGPT for mental health support,” said Lawrence Murphy with the Canadian Counselling and Psychotherapy Association Technology and Innovative Solutions Chapter.
Murphy says even recent updates to ChatGPT to better recognize suicidality and psychosis are not enough to make it safe. Roughly 1.2 million weekly users who talk to ChatGPT show signs of suicidal ideation, according to new data from OpenAI.
“The thing to understand about counselors and psychotherapists and our profession is that the absolute bedrock of it, the foundation of it is ethics and best practices,” he said. “These chatbots, these large language models, they have none.”
Watch the video above to learn more about why more people are turning to AI for mental health support.