AI chatbot privacy concerns grow as users share personal details

ROANOKE, Va. (WDBJ) – A recent study found that around one-third of AI app users are having deeply personal conversations with chatbots, using them as substitute therapists.
A separate Stanford study found that six leading US AI companies all feed user inputs back into their models for future training and sell user data through ads.
Apps like Okara AI have been gaining attention for campaigns such as “Think before telling AI everything,” positioning themselves as privacy-first alternatives to platforms from OpenAI and Google.
Here @ Home sat down with Donna Wertalik, a Virginia Tech professor and host of Voices of Privacy, to explain what companies are doing to address these concerns.
Some companies are adding guardrails so these tools support people without pretending to replace real professionals.
Large platforms are improving how they respond when users show signs of distress and are more often pointing people to real human help, like crisis lines or professional support.
Others are designing tools meant to work alongside therapists, not replace them.
Consumers should use these tools carefully and take simple steps to stay safe.
Wertalik said users should use digital tools as a support, not a replacement for real care.
She said users should avoid sharing deeply personal details like full names, addresses, medical history or details about others.
Users should not assume conversations are private or protected the way medical records are.
Wertalik said consumers should review privacy and data settings inside apps and platforms.
She said users should turn off data sharing when possible, limit chat history storage and opt out of training or personalization features if available.
She said users should avoid using these tools during a crisis.
Wertalik said if something feels serious or overwhelming, users should reach out to a trusted adult, licensed professional or a crisis hotline.
Experts warn that many tools store conversations and are not regulated the same way doctors or therapists are, so users need to stay aware and cautious.
Copyright 2026 WDBJ. All rights reserved.
