Scammers using AI to exploit emotional connections with victims – kjzz.com

by Jim Spiewak, KUTV
As AI chatbots become more advanced, experts warn that the emotional bonds users are forming with artificial intelligence are not only real but can be exploited.
A recent survey by Fractl found that 22% of generative AI users reported forming an emotional connection with a chatbot, and 3% described that relationship as romantic.
Anola Johnson said she lost $850,000 to two romance scams. She believed she was helping someone she loved.
“I don’t even know how I had access to that much money,” Johnson said. “When I realized it wasn’t real, it felt like the room was just spinning.”
She believes artificial intelligence played a role in making the scam feel legitimate, even if she was ultimately communicating with a real person.
Johnson isn’t alone. Utah Valley University's Emerging Tech Policy Lab director, Brandon Amacher, said younger users are particularly vulnerable because they don’t see AI as a tool.
“They view these chatbots as a companion, and there are serious emotional issues that arise from that,” Amacher said.
In the same Fractl survey, 16% of users said they turn to AI for therapy, and 19% treat it as a friend. That emotional reliance has become a blueprint for scammers.
“What happens when bad actors realize they can generate all this emotional content instantly?” Amacher asked. “You’re getting to a point where you can potentially trick very smart people if you’re good at social engineering.”
Even the creators of this technology are acknowledging its impact. OpenAI CEO Sam Altman recently posted on X that as age-gating expands, ChatGPT will allow more human-like interactions, including erotica for verified adults.
ChatGPT now claims nearly a billion monthly users, just two years after becoming the fastest platform ever to hit 100 million.
Amacher called that a potential danger for kids who use the platforms without adult oversight.
“Children don’t have the emotional calibration to deal with what it really is,” he said.
Utah lawmakers have already responded, passing six AI-related laws in the past two years. One of them, the Identity Abuse Law, makes it illegal to use AI to impersonate someone without consent.
Still, Amacher believes the laws will need to evolve just as fast as the technology.
“You might have to get it wrong the first time and pivot later, because if you wait, you’re already too late,” he said.
Johnson hopes her story helps others avoid what she went through.
“I hope I financially survive this,” she said. “But I’m a fighter too.”