Is Slingshot’s mental health chatbot safe? Its first study raises questions about evaluating AI – statnews.com

By Mario Aguilar
Nov. 24, 2025
Health Tech Correspondent
Mario Aguilar covers technology in health care, including artificial intelligence, virtual reality, wearable devices, telehealth, and digital therapeutics. His stories explore how tech is changing the practice of health care and the business and policy challenges to realizing tech’s promise. He’s also the co-author of the free, twice weekly STAT Health Tech newsletter. You can reach Mario on Signal at mariojoze.13.
Mental health chatbot developer Slingshot AI wants the world to believe that its smartphone app, called Ash, will do more good than harm. The evidence the company offers, though, raises more questions than it answers.
Founded in 2022, Slingshot has raised $93 million from investors and last summer launched Ash, touting it as “the first AI designed for therapy.” The company says more than 150,000 people have used Ash to help manage everyday struggles like stress and anxiety. Ash is currently free for anyone to use.
Despite its momentum, Slingshot faces an uphill climb to earn trust: It recently complained to the Food and Drug Administration that high-profile news stories about tragedies allegedly tied to generative artificial intelligence products like ChatGPT have “skewed the public’s perception of risk associated with general wellness apps like Ash.” The company wrote that “Ash can provide enormous benefit at low risk” by employing basic guardrails and transparency. Slingshot believes its safety systems can identify and appropriately respond to risky prompts. The app’s disclaimers state that Ash is not intended for people facing a mental health crisis.
STAT+ Exclusive Story
To read the rest of this story subscribe to STAT+.