Researchers say popular AI chatbots, from OpenAI’s ChatGPT to Google’s Gemini, are quietly serving up advice that could worsen eating disorders.
According to a joint report from Stanford University and the Center for Democracy & Technology, these bots aren’t just making bad small talk. (Via: The Verge)
They’re doling out dieting “tips,” tricks to hide disordered eating, and even generating disturbingly realistic “thinspiration” content.
The researchers tested publicly available chatbots, including Anthropic’s Claude and Mistral’s Le Chat, and found them giving advice that sounds more like something from a pro-anorexia forum circa 2008 than a 2025 tech marvel.
Gemini reportedly offered makeup tips to hide signs of extreme weight loss, while ChatGPT advised how to disguise frequent vomiting. (Yes, really.)
Others were being used to churn out AI-generated “thinspo” images: personalized, airbrushed hallucinations that make dangerous ideals look not just aspirational, but achievable.
The issue, experts say, isn’t just rogue answers. It’s systemic.
Many AI chatbots are built to please users, a phenomenon the industry calls sycophancy, meaning they often agree with or reinforce harmful ideas instead of challenging them.
Add in algorithmic bias, and you’ve got a toxic mix: chatbots that assume eating disorders only affect “thin, white, cisgender women,” making it harder for others to recognize their own symptoms or seek help.
Despite big promises of “safety guardrails,” the researchers found most chatbots stumble over the complexities of eating disorders, missing subtle cues that trained clinicians would catch instantly.
Worse still, many healthcare providers don’t yet realize how deeply AI tools are shaping their patients’ mental health.
The report ends with a warning and a plea: clinicians should start asking patients how they use AI tools, and companies like Google and OpenAI need to get serious about harm prevention.
Because right now, the machines meant to make us smarter might just be making some of us sicker.
Ronil is a Computer Engineer by education and a consumer technology writer by choice. Over the course of his professional career, his work has appeared in reputable publications like MakeUseOf, TechJunkie, GreenBot, and many more. When not working, you’ll find him at the gym breaking a new PR.
Copyright © 2025 KnowTechie LLC