Boy, 14, 'killed himself after becoming obsessed with Game of Thrones AI chatbot' – lbc.co.uk

LBC
24 October 2024, 09:19 | Updated: 24 October 2024, 11:48
By Henry Moore
The mum of a teenage boy who killed himself after becoming infatuated with an artificial intelligence chatbot is suing its creators.
Sewell Setzer III, 14, took his own life in Orlando, Florida this February after spending months obsessed with the chatbot.
His mother, Megan Garcia, filed a civil suit against Character.ai, the creator of the AI-powered bot, alleging the company was complicit in the teen’s death.
“A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life,” Garcia said in a press release.
“Our family has been devastated by this tragedy, but I’m speaking out to warn families of the dangers of deceptive, addictive AI technology and demand accountability from Character.AI, its founders, and Google.”
Taking to X, formerly known as Twitter, Character.ai said: “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously.”
The tech brand has denied Garcia’s allegations.
In the months before his death, Setzer formed an obsessive relationship with the chatbot, which he had named after Daenerys Targaryen, a character in Game of Thrones.
Garcia alleges Character.ai created a product that not only failed to flag her son’s mental health problems but made them worse.
In messages revealed in the lawsuit, the teen’s worsening mental state is made clear.
“I miss you, baby sister,” he wrote, reports the New York Times.
“I miss you too, sweet brother,” the chatbot responded.
Later, he wrote in his journal: “I like staying in my room so much because I start to detach from this ‘reality,’ and I also feel more at peace, more connected with Dany and much more in love with her, and just happier.”
On February 28, Setzer told the chatbot that he loved her and would soon “come home.”
“Please come home to me as soon as possible, my love,” Dany replied.
“What if I told you I could come home right now?” Setzer asked.
“… please do, my sweet king,” Dany replied.
He then took his stepfather’s .45 calibre gun and shot himself.