Misuse of AI chatbots in health care tops 2026 Health Tech Hazard Report – Association of Health Care Journalists


Photo by Matheus Bertelli via Pexels
The misuse of artificial intelligence chatbots such as ChatGPT, Gemini and Copilot in health care is the most significant health technology hazard for 2026, according to the nonprofit patient safety organization ECRI. Every year, the organization compiles a list of the top 10 hazards based on responses to member surveys, literature reviews, medical device testing in their lab and investigations of patient safety incidents.
Journalists can download an executive brief of the report for more information and to spur story ideas. The report identifies what ECRI considers the greatest potential dangers and offers recommendations to reduce the risks of patient harm.
Chatbots such as ChatGPT, Gemini and Copilot are not designed specifically for health care applications, ECRI experts cautioned during a January webcast. “They’re not medical devices. They’re not FDA-approved and regulated for that purpose,” said Rob Schluth, a principal project officer of device safety at ECRI. However, because those tools are becoming integrated into our lives, “we’re finding that many people in health care or with health concerns are turning to tools like these for advice about medical conditions, or treatments, or other health care-adjacent questions, and that poses some risks.”
Besides looking up information on health conditions, clinicians may use the chatbots to identify potential treatment options for a patient or create notes. Hospital staff may use them to make purchasing decisions or for help writing reports, experts said. 
It’s not that the chatbots themselves “have suddenly turned dangerous,” said ECRI’s president and CEO, Marcus Schabacker, M.D., Ph.D., but that when a chatbot’s output “feels helpful and definitive, people start to rely on it without necessarily questioning it.” 
Large language models (LLMs) like these are designed to respond in a way that keeps users engaged, ECRI staff said during the webcast, and not to challenge or correct flawed assumptions that may be included in queries. The chatbots also can make mistakes or fabricate, or “hallucinate,” information, because they are biased toward presenting what they predict a user wants to hear. They are also designed to sound definitive, and not to say, “I’m not sure” or “I’m sorry, I can’t help you with this,” Schabacker said.
A big misconception is that LLMs understand what they’re saying, said Christie Bergerson, Ph.D., a device safety analyst with ECRI. Instead, they predict the next word based on patterns in the data they were trained on, she said. They identify words that typically occur in conversations about a given topic and form them into sentences, she said. Responses are based on predictions and statistical probabilities.
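The prediction mechanism Bergerson describes can be illustrated with a deliberately simplified toy model. The sketch below is not how a real LLM works internally — production models use neural networks trained on vast datasets — but it shows the core idea: the program outputs the statistically most likely next word given its training text, with no understanding of what the words mean. The tiny corpus here is invented for illustration.

```python
from collections import Counter, defaultdict

# Toy training text (invented for illustration).
corpus = (
    "the patient has a fever . the patient needs rest . "
    "the doctor sees the patient ."
).split()

# Count which word follows which in the training text.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_word(word):
    """Return the word that most often followed `word` in training."""
    counts = follows[word]
    return counts.most_common(1)[0][0] if counts else "."

# "the" is most often followed by "patient" in this corpus, so the
# model confidently predicts it -- whether or not that fits the
# user's actual situation.
print(next_word("the"))  # -> patient
```

The model answers fluently and without hesitation, but its confidence reflects word frequencies, not knowledge — the same property that makes chatbot output “feel helpful and definitive” even when it is wrong.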
Chatbots still can be useful for brainstorming, getting background information or explaining complex topics, the experts said, but users should verify information and “check in with a human expert before taking actions or making decisions based off an LLM’s response,” Bergerson said.
AHCJ has covered concerns related to the use of AI chatbots and mental health through blog posts and a webinar last fall. 
Rounding out the top 10 list are these issues that can impact patient safety:
2. Unpreparedness for a “digital darkness” event, or sudden loss of access to electronic systems and patient information.
Cyberattacks, natural disasters, vendor outages and internal system failures all could potentially paralyze a health care facility, the report said, potentially delaying treatment or jeopardizing patient safety. Health systems should strengthen disaster recovery planning, including establishing downtime procedures, having reliable data backup processes and ensuring readiness through training and safety drills, the authors wrote.
3. Substandard and falsified medical products
Counterfeit products are reaching U.S. markets “with alarming frequency,” authors wrote, and those that do not function as intended can cause harm. They encouraged health care providers to strengthen their supply chains, demand high-quality products and implement measures to protect patients and staff from flawed products.
4. Recall communication failures for home diabetes management technologies
Continuous glucose monitors and other devices have improved quality of life for people with diabetes, but harm can result if product recalls and updates do not reach patients and caregivers in a timely manner. Home users of such technologies should be proactive in identifying and responding to safety notices about their devices and apps, the authors said, and providers and manufacturers should provide clear product safety information. 
5. Misconnections of syringes or tubing to patient lines
Inappropriate connections of syringes or tubing to patient lines intended for different uses can lead to medications, solutions, IV nutrition or gas being introduced into the wrong line, with severe consequences. Report authors encouraged hospitals to adopt the use of safety connector devices.
6. Underutilizing medication safety technologies in perioperative settings
Medication errors can occur at several points before, during and after surgical procedures, the authors noted. The drugs administered are often opioids or other “high alert” medications. Health care organizations should incorporate tools like barcode medication administration systems, where health care workers scan a patient’s wristband and medication label to ensure they match.
7. Inadequate device cleaning instructions
Failure to properly clean and disinfect/sterilize reusable medical devices between uses can spread infection or lead to device damage or other harms, the authors said. Reprocessing can be made challenging by the wide variation in instructions provided by manufacturers. Health organizations should consider reprocessing instructions before placing orders.
8. Cybersecurity risks from legacy medical devices
Older software-based devices and systems no longer updated with sufficient cybersecurity protections provide an opening for hackers to exploit. Health systems might consider disconnecting such devices from their networks, using security tools to manage vulnerabilities or planning to replace the devices, the authors said.
9. Health technology implementations that prompt unsafe clinical workflows
Implementing health care technologies without users fully understanding how to use them can contribute to various patient harms, especially if users resort to unsafe workarounds. Health systems should conduct comprehensive workflow analyses before deploying new technology and institute comprehensive training programs, the authors said.
10. Poor water quality during instrument sterilization
Failure to maintain water quality during instrument disinfection/sterilization exposes patients to potentially infectious pathogens or can cause instruments to become corroded or spotted with residues. Health systems should routinely assess the cleanliness of processed devices and sample water quality. 
Karen Blum
Karen Blum is AHCJ’s health beat leader for health IT. She’s a health and science journalist based in the Baltimore area and has written health IT stories for numerous trade publications.