ChatGPT and Claude can’t offer real empathy
Last week, Sam Altman, the CEO of OpenAI, tweaked ChatGPT to make it act more like a “friend” again. The company had briefly tuned the dials to make its popular AI chatbot less “effusively agreeable,” after it guided a teenager named Adam Raine, who had become very attached to it, to take his own life. But users revolted when OpenAI made the change, complaining ChatGPT now sounded like a robot, so Altman changed it back. “If you want your ChatGPT to respond in a very human-like way, or use a ton of emoji, or act like a friend, ChatGPT should do it,” Altman wrote on X.
> We made ChatGPT pretty restrictive to make sure we were being careful with mental health issues. We realize this made it less useful/enjoyable to many users who had no mental health problems, but given the seriousness of the issue we wanted to get this right.
>
> Now that we have…
Lonely people everywhere are increasingly turning to AI chatbots like ChatGPT and Claude for friendship and psychological support. We are, after all, in the midst of a loneliness epidemic, and unlike humans, chatbots have an infinite amount of time to listen. But one of the pillars of friendship is empathy, the ability to share and understand the feelings of another person. Can a virtual machine living in the cloud serve up real empathy?
The answer to that question is complicated, says empathy researcher Anat Perry of the Hebrew University of Jerusalem. She spoke on a panel about human-AI relationships at a conference on minds, artificial intelligence, and ethics hosted by the Dalai Lama Library in Dharamsala, India last week. “When it says it feels your pain or it shares your experience, it’s just faking it,” explained Perry. Chatbots can express cognitive empathy, taking another person’s perspective, and motivational empathy, signaling that they want to alleviate the listener’s pain, she said. But they can’t offer affective empathy, the actual sharing of another person’s joy or pain, which comes from real-life experience.
Perry suspected that most humans already understand this and value the empathic support of a human more than that of a chatbot. To test her hunch, she ran an experiment in which she tricked her subjects. Perry and her colleagues asked 1,000 people recruited online to share a recent emotional experience. Half of the group was told they would get a response from ChatGPT and the other half from a human. In fact, all of the responses were AI-generated—but prompted to be highly empathic. When they rated the responses, people said they felt more positive emotions, and fewer negative ones, when they perceived the responder to be a human.
A second experiment showed that 40 percent of people were willing to wait up to two years for a response to an emotional experience from a human instead of getting an immediate response from a chatbot. Those who chose a human said, “they wanted someone who could truly understand them, share some of their emotions, care for them, and maybe even alleviate their loneliness.”
But that still leaves the other 60 percent, who were more interested in hearing from a chatbot right away. It’s a potentially concerning finding. While Claude, ChatGPT, and other chatbots might offer a temporary Band-Aid for humanity’s loneliness crisis, the more we turn to the machines, the less time we will have for each other. Ultimately, we may all realize there is no actual shoulder to lean on, no hand to wipe away the tears. We will have tumbled into a hall of machine-held mirrors.
Lead image: Vector Mine / Shutterstock
Kristen French is an associate editor at Nautilus. She has worked in science journalism since 2013, reporting and editing features and news for publications such as Wired, Backchannel, The Verge, and New York magazine, among others. She studied science journalism at Columbia University. She is based in San Diego.
© 2025 NautilusNext Inc., All rights reserved.