Therapists Should Ask Patients About Their AI Use, New Paper Advises – InsideHook

Earlier this year, a New York Times investigation by Jennifer Valentino-DeVries and Kashmir Hill uncovered something disquieting about the current state of mental health: a number of therapists reported that their patients were experiencing delusional behavior as a result of their interactions with AI chatbots. Mental health professionals are not the only people witnessing this: WIRED recently chronicled legal efforts to stop chatbots from encouraging psychologically troubling behavior.

With AI a growing presence in many people's lives, it isn't strange to think that it's something to consider when treating mental health issues. Now, a paper published earlier this month in the journal JAMA Psychiatry makes that case formally, arguing that "such conversations are essential."

As the authors explain, they opted to focus on why therapists should ask patients about their AI use, rather than on other aspects of the technology. "While professional guidance focuses on how clinicians should use AI tools, conversations with patients about their AI use receive less attention," they write.

In an interview with NPR's Rhitu Chatterjee, one of the paper's authors — NYU's Shaddy K. Saba — made the case that a nonjudgmental approach to AI use was important. Dr. Saba compared a therapist asking their patient about AI use to a therapist asking their patient about their use of different substances. And that seems logical — a patient in a relationship with a chatbot and a patient who has never used AI exhibiting similar symptoms might require very different approaches to treatment.

As NPR’s reporting on the new paper points out, its recommendations are not far removed from a more formal set of guidelines released last year. In November 2025, the American Psychological Association released a set of guidelines related to both AI chatbots and health applications in general. That report argued that chatbots were not a substitute for working with a therapist, but that they “may be appropriate as a supportive adjunct, not substitute, to an ongoing therapeutic relationship.” Whatever form it takes, it’s good to see a growing number of professionals exploring the myriad ways this technology intersects with mental health.
Copyright © 2026 InsideHook. All rights reserved.

