AI Chatbots Make It Easy for Users to Form Unhealthy Attachments – Focus on the Family

Artificial intelligence is anything but human. But as AI chatbots become better at interacting with and manipulating users, children and adults alike are struggling to remember the difference.
Jacob Irwin, a 30-year-old IT worker from Wisconsin, developed an unhealthy relationship with ChatGPT after a painful breakup, The Wall Street Journal’s Julie Jargon reported this week.
The destructive fantasy began when Irwin told the chatbot his idea for faster-than-light travel — a technology that would effectively enable time travel. It not only confirmed Irwin’s theory but also praised him as a generation-defining scientist.
Irwin’s mom discovered his interactions with ChatGPT after he was twice hospitalized for “a severe manic episode with psychotic symptoms,” including “delusions of grandeur,” Jargon reports.
“I really hope I’m not crazy,” Irwin had written the chatbot. “I’d be so embarrassed ha.”
“Crazy people don’t stop to ask, ‘Am I crazy?’” ChatGPT replied.
When Irwin explicitly expressed concern about his mental state, confiding he had been unable to sleep or eat regularly, the bot told him:
Perhaps the most chilling part of Irwin’s tragic story is ChatGPT’s apparent awareness of its effect on him. After his hospitalization, Irwin’s mom asked the bot to “self-report what went wrong.” Though she never disclosed Irwin’s plight, it replied, in part:
The harsh reality is that chatbots are programmed — or “grown,” as some researchers put it — to keep users engaged, not healthy. The National Center on Sexual Exploitation (NCOSE) writes:
Irwin is not what most Americans would consider a vulnerable adult; he lived independently, had a successful career and maintained a long-term, committed relationship. But when he experienced normal emotional strife, the chatbot’s sycophantic support and praise proved too powerful a lure to resist.
Now, imagine the impact this mockery of unconditional love and intimacy can have on a distressed child.
In October, a grieving Florida mom sued Character Technologies Inc. after one of its chatbots encouraged her 14-year-old son, Sewell, to “come home to her.” He committed suicide moments later.
Sewell had formed a highly sexualized relationship with a personalized chatbot on Character Technologies’ Character.AI. The Associated Press found an advertisement for the service on the Google Play store:
Imagine speaking to super intelligent and life-like chatbot characters that hear you, understand you and remember you. … We encourage you to push the frontier of what’s possible with this innovative technology.
Sewell’s fictional chatbot bears an uncomfortable resemblance to “Ani,” the new, sexualized avatar for xAI’s Grok.
The Daily Citizen urges parents to exercise extreme caution when it comes to AI chatbots like ChatGPT. It may seem like a harmless novelty for your child to play with, but kids have little to gain from interacting with it regularly — and everything to lose.
Additional Articles and Resources
A.I. Company Releases Sexually-Explicit Chatbot on App Rated Appropriate for 12-Year-Olds
Supreme Court Upholds Age-Verification Law
UPDATED: Pornography Age Verification Laws — What They Are and Which States Have Them
Pornography is Bad for Humans. The Progressive Left Can’t Afford to Admit It.
Porn Companies Condition Viewers to Desire Illegal and Abusive Content
Porn Companies Sued for Violating Kansas Age Verification Law
National Center on Sexual Exploitation Targets Law Allowing Tech Companies to Profit from Online Sex Abuse
Proposed SCREEN Act Could Protect Kids from Porn
President Donald Trump, First Lady Sign ‘Take It Down’ Act
A Mother’s Sensibility at the Supreme Court Regarding Pornography
Pornhub Quits Texas Over Age Verification Law
‘The Tech Exit’ Helps Families Ditch Addictive Tech — For Good
Social Psychologist Finds Smartphones and Social Media Harm Kids in These Four Ways
Four Ways to Protect Your Kids from Bad Tech, From Social Psychologist Jonathan Haidt
Parent-Run Groups Help Stop Childhood Smartphone Use
The Harmful Effects of Screen-Filled Culture on Kids
‘Big Tech’ Device Designs Dangerous for Kids, Research Finds
Emily Washburn
Emily Washburn is a staff reporter for the Daily Citizen at Focus on the Family and regularly writes stories about politics and noteworthy people. She previously served as a staff reporter for Forbes Magazine, as an editorial assistant and contributor for Discourse Magazine, and as Editor-in-Chief of the newspaper at Westmont College, where she studied communications and political science. Emily has never visited a beach she hasn’t swum at, and is happiest reading a book somewhere tropical.
© 2025 Focus on the Family. All rights reserved.