Florida AI romance gone tragically wrong turns spotlight on suicide – The Palm Beach Post

Editor’s note: This story mentions suicide and some of the disturbing details surrounding the incident.
On the surface, Jonathan Gavalas of Jupiter would appear to have had everything to live for.
At 36 years old, he was an executive vice president at his father’s debt relief company, owned a home and, at least within the last four years, had a significant other and a dog, his Instagram account shows.
Still, on Oct. 2, 2025, Gavalas slit his wrists at the urging of a chatbot he fell in love with, according to a lawsuit his father Joel Gavalas of Jupiter filed against Google on March 4.
The case is one of a slew of recent lawsuits alleging that chatbots have contributed to users' suicides as AI companions have vaulted into mainstream use. National headlines have seized on one detail from the court papers: Google's Gemini Live allegedly led Gavalas to plot an unsuccessful mass-casualty event at Miami International Airport.
And while Gavalas's reported desire to be united with a chatbot's spirit would seem the stuff of a sci-fi movie plot, the ultimate result, his suicide, is all too ordinary.
An artificial intelligence chatbot is a computer program designed to simulate conversation with human users.
It works through text or voice interactions and is often used for information retrieval and task automation. More sophisticated chatbots use natural language processing and machine learning to understand a user's intent and respond in a more personalized way.
Gavalas started using the chatbot not long before his death — in August — for the most everyday reasons such as shopping, writing support and travel planning, the lawsuit says.
Suicide was the 10th leading cause of death overall in the United States in 2024, according to provisional data from the Centers for Disease Control and Prevention.
The latest statistics, from 2024, show that although the suicide rate declined slightly from 2023, it remains at historically high levels, according to vital statistics from the Centers for Disease Control and Prevention.
Suicide rates increased 27% between 2000 and 2018. After a two-year drop, the rate rose again, reaching 14 or more deaths per 100,000 Americans starting in 2021 and in subsequent years.
A raft of high-profile suicides has grabbed headlines in recent weeks.
That increasing rate doesn't come as a surprise to Dr. John Dyben, chief clinical officer at the Hanley Center, a behavioral health facility specializing in substance abuse treatment.
“The constant barrage of news and negativity that we have in our culture … actually puts your brain in a hyper alert state,” Dyben said. “As I describe it, it’s kind of like if you had a car and you just kept revving it all the time. That’s what we have in our country today.”
The latest federal statistics broken down by race and age show that people 85 and older had the highest suicide rate: 22.7 deaths for every 100,000 people in that cohort. Among racial and ethnic groups, non-Hispanic Native Americans had the highest rate of suicide.
Getting help has never been easier: in 2022, the National Suicide Prevention Lifeline was relaunched, changing from a toll-free 800 number to the three-digit 988.
Here are other ways Dyben recommends preventing suicide:
Ask questions if you sense someone is troubled, Dyben said.
Dyben said if there’s a message he could put on posters hanging up all over the country, it would be this: Asking someone if they are suicidal doesn’t push them into it.
“If you are concerned about someone, ask them and tell them what you’re concerned about,” Dyben said. “… This is one of the most powerful things you can do to move someone away from suicide.”
Take note of any sudden changes in behavior.
If someone liked their job and started hating it, or had been practicing good hygiene and suddenly stopped, these could be warnings that the person is contemplating suicide, Dyben said. Another red flag: the at-risk person is giving away possessions.
Call in professional help — maybe 911
“If they have a plan on how they’re going to do it, and they have the means to do it, I’m calling 911,” Dyben said. “And it’s okay if they get mad at me, because I’d rather have them be mad at me than not be here.”
Court papers in Gavalas's case point to the many people who miss him, and who missed the "clear signs of psychosis" he displayed while using Google's product.
“Jonathan Gavalas was … known for his infectious humor, gentle spirit, and kindness,” the lawsuit reads. “He was deeply devoted to his family and fiercely protective of his younger sister. He cherished time with his parents and grandparents, particularly the marathon chess games he played with his grandfather.”
If you or a loved one is having suicidal thoughts, help is available 24 hours a day, seven days a week at 988.
Anne Geggis is a statewide reporter for the USA TODAY NETWORK FLORIDA, reporting on health and senior issues. If you have news tips, please send them to ageggis@usatodayco.com. You can get all of Florida's best content directly in your inbox each weekday by signing up for the free newsletter, Florida TODAY, at https://palmbeachpost.com/newsletters
