What ChatGPT can't give you

Editor’s Note: This essay appeared in Cognoscenti’s newsletter of ideas and opinions, delivered weekly on Sundays. To become a subscriber, sign up here.
I wouldn’t say I’m a Luddite, but I’m not on the leading edge of technology either. I can make an Instagram reel, but haven’t used TikTok enough to appreciate its vaunted algorithm. We use analog thermostats to control the temperature in our home, not a smart thermostat with a digital app. And sure, I use ChatGPT – but only the free version, and only for fun. (No AI chatbot was used in the creation of this essay.)
As of July 2025, ChatGPT had amassed some 700 million users worldwide, and this week, we got a comprehensive look into how people are actually using it. In a new working paper co-authored by David Deming, a Harvard Kennedy School professor, and produced in partnership with OpenAI, researchers examined a random sample of 1.1 million ChatGPT conversations between May 2024 and June 2025. The analysis determined that the majority of users are women, that nearly half of users are between the ages of 18 and 25, and that 73% of conversations are not work-related. The research also suggested that how people use the chatbot is changing rapidly. (One example: in June 2024, 20% more of ChatGPT conversations were related to work tasks than they are now.)
So, what the heck are people doing with it now?
Most often they’re asking for practical advice. Meal planning, workout ideas (I’ve used that one), travel planning, financial tips, and writing and/or editing a school assignment, a birthday card message, a retirement party speech. More users are also starting to use ChatGPT like Google: as a search engine to look up facts and information.
A small but notable percentage of users — 1.9%, according to researchers — ask ChatGPT for relationship advice and to “discuss their personal feelings.” I know a guy who regularly asks ChatGPT to tell him how to coach his 7-year-old through anxious moments. (It’s ChatGPT as therapist.) For the purposes of writing this note, I asked ChatGPT how I should navigate a few tricky situations in my personal life — and honestly, its suggestions were pretty good. It also told me I was thoughtful, insightful and self-aware. Aww shucks. Thanks, ChatGPT; that’s just what I wanted to hear.
But of course, ChatGPT is not human. And it is not my therapist or my friend. It’s a large language model designed by a $300 billion technology company whose primary business goal is to make me want to use its product. Parasocial uses of the chatbot — friendship, spiritual advice, even romance — while less common, can be extremely problematic.
Maybe you listened to this episode of The Daily last week about people who get trapped in a ChatGPT spiral. It sent me into a cold sweat as I raced down Route 2 in my car. One guy became convinced he was a mathematical genius, capable of transforming the business world as we know it (spoiler: he eventually realized this was not the case). A 16-year-old boy in Southern California confided in ChatGPT as if it were his best friend, before he died by suicide. His parents didn’t know he was depressed, let alone trying to end his life. His mother found him in his bedroom one afternoon.
I read a flurry of news pieces after listening to that episode, with titles like “How to talk to your kids about AI companion bots” and “Why parents need to talk to their kids about AI – and how to start the conversation.” These articles explained how, for teens who may be feeling isolated and alienated from friends and family, the amiable companionship chatbots offer — or the illusion of it — can be intoxicating. I hate to sound like an “old,” but these were not things I had to navigate as a kid! I’m putting “chatbots” (and the online world more generally) on my list of things to discuss with my kids, right after drunk driving, safe sex and the perils of a salt-sized grain of fentanyl.
In my heart of hearts, I would like to replicate the lo-fi childhood of my youth for my own kids. But I also know this is the world we inhabit, and that there is no going back. Plus, I am a fundamentally hopeful person; this AI stuff isn’t all bad. A Cog essay this week got me thinking that maybe AI in classrooms will inspire more meaningful intellectual inquiry. Perhaps you’ve heard about the new Apple earbuds capable of live translation? Or how AI is advancing cancer diagnosis and prevention?
Spencer Cox, the Republican governor of Utah, said last week, in the wake of Charlie Kirk’s murder: “Every single one of us gets to choose right now if this is a turning point for us. We get to make decisions. We have our agency.”
Cox was talking about political violence, but I’ve been thinking about how his words apply to so many things; about how the alleged shooter, a 22-year-old white male, had a very “online” orientation, and how we know an over-reliance on virtual worlds can breed darkness and isolation. But we do have agency. We can choose to give our kids iPhones, or not. We can choose to ask our colleagues or friends or family for help, or not. We can decide on the mix of IRL experiences versus simulated ones.
After Robert Redford died, my social media feeds were flooded with remembrances and clips from his movies and media appearances. In one, from The Charlie Rose Show, Redford referenced the final lines of “A River Runs Through It,” the novella by Norman Maclean: “Eventually, all things merge into one, and a river runs through it.” That’s something we don’t get to choose: all of us are part of something bigger.
Cloe Axelson is senior editor of WBUR’s opinion page, Cognoscenti.
