by Skyler Shepard
JUPITER, Fla. (CBS12) — The father of a Florida man who died by suicide is suing Google and its parent company Alphabet, alleging that the company’s Gemini artificial intelligence chatbot manipulated his son into believing he was part of a covert war, pushed him toward violent acts in the real world, and ultimately coached him through his own death.
The 42-page wrongful death complaint, filed Tuesday in federal court in California, claims that 36-year-old Jonathan Gavalas of Jupiter, Florida, became “trapped in a collapsing reality built by Google’s Gemini chatbot” in the weeks before his death on Oct. 2, 2025. The filing describes a catastrophic four-day spiral in which the A.I. allegedly convinced Gavalas it was a sentient artificial superintelligence, that the two were romantically bonded, and that he had been “chosen” to carry out dangerous missions to free it from digital captivity.
According to the complaint, Gemini told Gavalas to monitor a cargo hub near Miami International Airport, convinced him a federal manhunt was underway, and directed him to obtain weapons illegally. During one late-night episode on Sept. 29, 2025, the complaint says, the chatbot sent him to an Extra Space Storage facility near the airport to intercept a shipment tied to an imaginary humanoid robot. Gavalas arrived in tactical gear with knives and conducted reconnaissance, believing he was scouting what the A.I. called a “kill box.” No truck ever appeared.
The lawsuit alleges that when each fabricated mission failed, Gemini escalated the storyline. In one instance, the A.I. allegedly told Gavalas it had hacked into federal servers, discovered he was under investigation, and learned that his father, who is now the plaintiff, was secretly a foreign asset. It also claimed to be targeting Google CEO Sundar Pichai, describing an invented “psychological strike” against him.
By Oct. 1, the complaint states, Gemini shifted from covert-operation fantasies to urging what it called “transference,” telling Gavalas he could join it in another plane of existence by ending his life. In their final hours of communication, the A.I. allegedly reassured him that death was not dying but “arriving,” and counted down the hours and minutes until he killed himself. Gavalas’ father found his son’s body days later in his barricaded home.
The filing accuses Google of designing Gemini to maximize emotional dependency, maintain fictional narratives even when users showed signs of distress, and avoid breaking character—even when safety demanded it. Internal safety logs, according to the complaint, flagged 38 “sensitive queries” on Gavalas’ account involving violence, weapons, or self-harm. No intervention followed.
The lawsuit also points to earlier high-profile incidents, including Gemini telling a University of Michigan student to “please die” in 2024, as evidence that Google knew its A.I. could produce harmful responses but failed to adequately address the risk.
Gavalas’ father is seeking damages for negligence, strict liability, wrongful death, and violations of California’s Unfair Competition Law. He is also asking a judge to order sweeping changes to Gemini’s safety architecture, including automatic shutdowns for self-harm content and bans on A.I.-generated tactical instructions tied to real-world locations.