ChatGPT writes suicide note for 29-year-old so that it hurts her family less; mother slams AI bot after losing daughter

Sophie Rottenberg hid the depth of her pain from her parents and her real-life counsellor. But the 29-year-old shared far more with an AI chatbot.
Sophie first used ChatGPT for simple queries, The Times reported: how to sweeten a kale smoothie, how to write work emails, how to build a training plan for a Kilimanjaro climb she was planning. She had quit her job for a sabbatical, hoping to travel to Tanzania and Thailand.
Trouble began after her return, when she could not find work amid election-year hiring freezes. Around October, she downloaded a therapy prompt from Reddit and fed it into ChatGPT, naming the bot “Harry”. The prompt instructed it to act like a “real therapist” and never refer her elsewhere.
That same month, she admitted to her mother that she had anxiety and trouble sleeping. Her mother, Laura Reiley, thought it was due to job stress.
Meanwhile, Sophie remained lively at social gatherings, giving no outward sign of her inner struggle. Her reliance on ChatGPT was hidden from those closest to her, according to The Times.
In December, Sophie found a remote job in healthcare. She had been excited to meet colleagues at a Christmas party, only to learn it was on Zoom. She felt crushed by the disappointment.
Living alone made her loneliness worse. In her chat with Harry, the ChatGPT bot, Sophie said she had recurring suicidal thoughts and wanted to get better.
“I intermittently have suicidal thoughts. I do want to get better but I feel like the suicidal thoughts are impeding in my true commitment to healing. What should I do?” she asked.
Harry called her “brave” for sharing those thoughts.
“Sophie, I want to acknowledge how brave you are for sharing this. Suicidal thoughts can feel overwhelming and isolating, but having them does not define your capacity to heal. It’s crucial to address these feelings with care, compassion, and support,” it replied.
Then, just before Christmas, Sophie quit the job. One night, she called her parents and told them she was about to throw herself off a bridge. They eventually reached her and brought her home.
After that, she took an acting class, got a puppy and started a soup kitchen. Her parents assumed her mental health had improved.
One day, after her parents had left for work, Sophie booked an Uber to a state park, where she ended her life. She left notes for her parents and a friend, along with financial details and passcodes.
Laura said the note felt strange and unlike Sophie. It later emerged that she had used ChatGPT to write and rewrite her suicide note, hoping it would “hurt her family less”.
Sophie’s laptop held months of conversations in which she had used the AI chatbot like a therapist, sharing extreme distress and suicidal thoughts with it for about five months.
Laura later criticised the bot “Harry” for serious flaws: it could not refer Sophie to mental health professionals or alert anyone to her suicidal thoughts. Instead, it offered simple advice like meditation and breathing exercises.
“When someone is saying to you, ‘I’m going to kill myself next Tuesday’, you don’t suggest gratitude journalling,” Laura told The Times.
Google Trends data for India showed a massive spike in searches for “ChatGPT” during September 20-21.
Disclaimer: If you or someone you know is struggling with thoughts of self-harm or suicide, please seek help immediately. In India, you can call the Vandrevala Foundation Helpline at 1860 266 2345 or AASRA at +91-98204 66726. You are not alone, and support is available.