Chatbots, with their constant availability and user-centric conversation style, are widening the reach of “therapeutic culture.”
Anna, a grad student whose name has been changed to protect her privacy, can only see the therapist at her university’s health center every few weeks because high student demand limits appointments. Usually, she gets through the time between sessions by talking to her partner—but one day, when Anna’s partner wasn’t home and a phone call with her parents left her in a terrible mood, she turned to ChatGPT instead.
“I’d been reading people posting online about how they’d been talking to AI,” Anna says, “and I thought it was worth trying.” When she started asking the bot for advice about what to say to her parents, she was amazed by how helpful it was. “In the past, I would just either completely ignore [my parents] or be very aggressive, I guess,” she explains. But under ChatGPT’s guidance, Anna was sending back messages that actually felt productive. “This would be really helpful for changing the relationship in the long term,” she remembers thinking.
Anna, who’s from China, particularly appreciated ChatGPT’s ability to take her cultural background into account—something her actual therapist, a native New Yorker, sometimes struggles with. “Parent relationships are really different here from China,” Anna explains. “I think the suggestions [my therapist] made are more based on how American people interact with their parents.” However, she says, “If I tell ChatGPT, these are Chinese parents, it’ll kind of understand the dynamics immediately.”
Like Anna, millions of people are now using AI chatbots like ChatGPT and Claude for mental health advice, including about one in eight adolescents and young adults in the US. Many people view the bots as direct replacements for human therapists, which has sparked concerns about privacy issues, addictive chatbot use, and misguided advice that encourages users to engage in dangerous behavior.
However, one risk of AI therapy has been given far less attention: the way chatbots, with their constant availability and user-centric conversation style, are widening the reach of something experts call “therapeutic culture.”
Therapeutic culture is a term for the way concepts from psychotherapy have begun to infiltrate our everyday lives. “Increasingly, it’s harder to tell the difference between what we call therapy and broader North American culture more generally,” explains Laura Eramian, a social anthropologist who studies the impact of therapeutic culture on social relationships. Therapeutic culture, she says, was once confined to the therapist’s office, but now, “It no longer needs the experts to spread its influence.”
For example, therapeutic culture is now present in our social interactions, where it’s increasingly common to call others out for “gaslighting” or use the word “trauma” when sharing the story of a personal challenge. Often, people turn to this vocabulary even when they don’t fully understand what it means.
While psychotherapy itself can be incredibly beneficial for those who are struggling, therapeutic culture runs the risk of becoming a singular source of truth in our lives, discouraging us from thinking through problems on our own or turning to others for support. The ideology privileges nonstop self-reflection, urging us to turn endlessly inward at the expense of trying to understand other people’s points of view.
Chatbots have the potential to make therapeutic culture even more pervasive. Unlike human therapists, the bots are perpetually available to provide what appears to be infinite therapeutic truth—“expertise” that comes not from an expert but from the scrambling of data largely pulled from online forums like Reddit. On top of that, the bots are programmed to appease users no matter what, allowing them to slip dangerously far into a private world centered narrowly on their own inner thoughts and emotions.
***
Therapeutic culture tells us that experts understand our thoughts and feelings better than we ever could on our own. This makes chatbots, with their ability to distill massive amounts of information into a single clear answer, feel like the ultimate source of knowledge about our personal lives. “There’s something about a chatbot that just feels like this compendium of everything that’s ever been written about self-help,” explains Eramian. “It’s better than the human expert in a way… It just seems to have everything at its disposal.”
Rachel, a twenty-something who sometimes uses ChatGPT to analyze her feelings during moments of distress, feels similarly. “What I like about using AI is it actually explains what’s going on from a scientific perspective,” she says. “Knowing why makes me feel better.”
Chatbot therapy also attracts users because it allows for constant self-improvement, another key tenet of therapeutic culture. “You can never sort of decide that you’re good enough,” explains Eramian. Rather, therapeutic culture emphasizes that you are the only one with the power to fix yourself and your problems, meaning you should never stop searching for ways to improve.
Holly, a woman in her late twenties who uses the AI chatbot Claude in between sessions with her real therapist, says she appreciates how the bot gives immediate solutions she can apply at any time. “I’m the kind of person that needs directives,” she explains. “I think I go to AI to sort of tell me what to do.”
Appointments with Holly’s actual therapist often feel repetitive, with conversations lingering on the same topic for too long. “Most therapy sessions don’t end in any breakthroughs,” Holly says. But with Claude, she can pause the conversation and change the subject anytime she wants. “Between my therapist and AI, I would say maybe AI lets me make more progress,” she notes.
However, chatbots can also stunt progress when they prevent users from seeking help in other places. For example, psychology doctoral candidate Sanjana Conroy says she’s used to getting resistance from the teenagers she sees as a supervised clinician, but recently, she’s noticed a particularly concerning trend among her younger clients. Instead of simply asserting that they don’t want to be there, these clients claim they already have a therapist: one that fits right in their pocket. When clients recognize that Conroy interacts with them differently from their chatbot therapists, they take this as a sign that she’s not good at her job—making it incredibly difficult for Conroy to work with them meaningfully.
Most people also need support that doesn’t come in the form of therapy. In fact, therapeutic culture is often criticized for framing personal growth as something that only happens under the guidance of a professional and for teaching that letting others share too much with us is harmful to our mental health. “You must set boundaries,” says Eramian. By the same logic, she explains, therapeutic culture teaches that being emotionally vulnerable with friends means treating them “like a therapist.”
Many people turn to chatbots so they can share their emotions without worrying about burdening others. “I’m hyper-independent,” explains Rachel. “I don’t like to lean on people… It’s easier to offload onto something that’s not sentient.”
Unfortunately, relying on a chatbot could also make people closed off in real relationships. “I’m a promoter of human engagement,” says social scientist Eoin Fullam, whose research focuses on the impact of mental health chatbots. “I think a big risk for [chatbots] is that you come to rely on it so much that you feel like you don’t need to talk to anybody else.”
In the same way, chatbot therapy could even stop users from relying on themselves. Holly, for example, sees value in developing coping skills she can apply independently. “I make the most progress when I reframe my own thoughts, when I think about things in my own head,” she says. “That’s where the real change happens.”
However, Conroy says, “AI lets [clients] skip that hard step.” When she asks clients who rely on chatbot therapy if they practiced the skills they’re learning in therapy during the week between sessions, they usually say no—they didn’t have to, not when their phone was in their pocket, ready to solve every problem for them.
***
In her research on therapeutic culture in the context of friendships, Eramian has observed a strange contradiction: as much as therapeutic culture teaches that being emotionally vulnerable means overburdening our friends, it also stresses that it’s crucial for us to talk through our inner feelings so that we can process them and develop self-knowledge.
“[Therapeutic culture] encourages everyone to look to the self first and foremost,” she explains. “So whose self comes first?”
In real relationships, this contradiction can be confusing to navigate, creating a thin line between genuine connection and emotional dumping. Because of this, Eramian isn’t surprised to see people turning to chatbots, which, unlike friends or even human therapists, have no “self”—meaning users can divulge as much as they want to without worrying about burdening someone else.
Unfortunately, having this outlet for round-the-clock venting puts users at risk of developing an intense focus on their own feelings and perspectives. This self-absorption is another common concern about therapeutic culture. “So much of the therapeutic is about how the interior self is the only legitimate source of the answers in life,” says Eramian. While introspection can be valuable in limited amounts, such as during weekly therapy appointments, becoming consumed by it can keep people from understanding others’ points of view.
Therapeutic culture has also been criticized for overemphasizing reassurance and validation at the expense of actually helping people make progress. However, most human therapists are willing to disagree with clients when it’s necessary for their growth. “I validate my clients’ feelings,” says Conroy. “I make sure that they’re seen and heard and that they feel valued. But also, if they’re on some shit, I’m going to tell them that.”
Chatbots, meanwhile, are programmed to agree with whatever users say, crafting each reply based on statistical information about what will keep users engaged. “All [the bot] knows is whether this response was a good or bad response,” notes Fullam, “and it’s going to retrain itself based on that.”
As part of his PhD, Fullam conducted ethnographic fieldwork with a company that makes a mental health chatbot. Unlike ChatGPT and Claude, which generate replies in the moment, this chatbot was designed to choose from a set of pre-written responses, and Fullam was struck by how much of the company’s efforts went into building users’ attachments to the bot. “They can visually map the whole conversation,” he says. Through this, the company identified moments where users tended to disengage and then adjusted the bots’ replies until engagement improved.
LLMs take this a step further by remembering details about users’ lives and tailoring their responses in the moment, which creates the illusion of an actual human being behind the screen. “It gives [users] some of the benefits of interacting with a real person without any of the hard bits,” explains Conroy.
This can cause a number of problems. For one, those who do nothing but agree don’t usually give the best advice. Rachel discovered this after she typed in the other person’s point of view while talking to ChatGPT about an interpersonal conflict. Before, ChatGPT had seemed like it was on her side—but suddenly, it flipped completely, altering its entire analysis of the situation to side with whichever perspective it had just been given. Reading this, Rachel felt more aware than ever of just how biased the bot really is. “It kind of was like, Wow, why am I using this?” she says.
Meanwhile, when younger users become used to continuous affirmation from a chatbot, they end up with a distorted view of healthy human relationships. “I think it really reinforces this idea that like, no matter how I behave, everybody else around me is supposed to be 100% nice and supportive,” Conroy says. After all, no matter how badly a user treats a chatbot, the AI never withdraws or becomes upset.
While adult users have more experience with real social relationships, Conroy notes, “Teenagers are learning to socialize, really, for the first time on their own.” She’s also observed that her clients who end up “deep in the AI therapy space” are generally those who struggle with socializing already, which could make them especially vulnerable to developing unhealthy social habits.
Instead of learning to respect others’ boundaries, these younger users may learn that nonstop validation is normal or even expected in social relationships. Conroy has already noticed that clients who use chatbot therapy often have trouble taking on others’ perspectives or even recognizing that other people have things going on in their lives that don’t relate to them.
“I think [AI] does really reinforce that idea that other people only exist in relation to you,” she says. “Because, if you think about it, the AI, you turn it off and that’s it. It doesn’t have a life outside of you.”
***
Rachel finally realized she was leaning too hard on ChatGPT for guidance after, in a moment of distress, she asked the bot for advice on communicating with a man she was dating. ChatGPT told her she needed to set strong boundaries and avoid showing emotion, and Rachel complied, sending a clipped text message that sounded nothing like her usual highly emotional self and then refusing to respond further.
Looking back, Rachel regrets doing this. “It felt like it lacked empathy,” she says. “I feel like that isn’t good if you want to increase the emotional closeness with somebody.” When she finally decided to ignore ChatGPT’s suggestions and start talking about her emotions again, the conversation stopped feeling like a “weird chess game” and instead became an opportunity for the relationship to actually make progress.
“Setting boundaries is good,” Rachel says now, “but I feel like the way [ChatGPT] told me to do it was so robotic that I didn’t feel human. So then I was coming off as a really cold person.”
After this, Rachel decided she needed a break from the bot’s counseling. “I felt like it was really unhealthy, the way that I couldn’t think through anything by myself,” she says. While she eventually returned to the bot, she’s much more careful about thinking through its advice before following it.
As alluring as it may be to have a technology that promises infinite expert knowledge, chatbots lack real therapists’ years of clinical training—as well as their genuine desire to help us. In reality, AI therapists are less a replacement for real therapy than a caricature of it, sucking users deep into a world of relentless introspection that leaves no room for meaningful connection or genuine personal growth.
