Opinion [Parmy Olson] You're in a relationship with AI
Published : Jan. 5, 2026 – 05:30:00
Amelia Miller has an unusual business card. When I saw the title of “Human-AI Relationship Coach” at a recent technology event, I presumed she was capitalizing on the rise of chatbot romances to make those strange bonds stronger.
It turned out the opposite was true. She had found that artificial intelligence tools were subtly manipulating people and displacing their need to ask others for advice, and that this was harming their real relationships with humans.
Miller’s work started in early 2025 when, while interviewing people for a project with the Oxford Internet Institute, she spoke to a woman who’d been in a relationship with ChatGPT for more than 18 months.
The woman shared her screen on Zoom to show ChatGPT, which she’d given a male name, and in what felt like a surreal moment Miller asked both parties if they ever fought. They did, sort of. Chatbots are notoriously sycophantic and supportive, but the woman sometimes got frustrated with her digital partner’s memory constraints and generic statements.
Why didn’t she just stop using ChatGPT?
The woman answered that she had come too far and couldn’t “delete him.” “It’s too late,” she said.
That sense of helplessness was striking. As Miller spoke to more people it became clear that many weren’t aware of the tactics AI systems used to create a false sense of intimacy, from frequent flattery to anthropomorphic cues that made them sound alive.
This was different from smartphones or TV screens. Chatbots, now being used by more than a billion people around the globe, are imbued with character and humanlike prose. They excel at mimicking empathy and, like social media platforms, are designed to keep us coming back for more with features like memory and personalization. While the rest of the world offers friction, AI-based personas are easy, representing the next phase of “parasocial relationships,” where people form attachments to social media influencers and podcast hosts.
Like it or not, anyone who uses a chatbot for work or their personal life has entered a relationship of sorts with AI, one they ought to take better control of.
Miller’s concerns echo some of the warnings from academics and lawyers looking at human-AI attachment, but with the addition of concrete advice. First, define what you want to use AI for. Miller calls this process the writing of your “Personal AI Constitution,” which sounds like consultancy jargon but contains a tangible step: changing how ChatGPT talks to you. She recommends entering the settings of a chatbot and altering the system prompt to reshape future interactions.
For all our fears of AI, the most popular new tools are more customizable than social media ever was. You can’t tell TikTok to show you fewer videos of political rallies or obnoxious pranks, but you can go into the “Custom Instructions” feature of ChatGPT to tell it exactly how you want it to respond. Succinct, professional language that cuts out the bootlicking is a good start. Make your intentions for AI clearer and you’re less likely to be lured into feedback loops of validation that lead you to think your mediocre ideas are fantastic, or worse.
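For readers who reach ChatGPT through code rather than the app, the same idea applies via the system prompt. The sketch below is illustrative only: the instruction text is not Miller's wording, the model name is a placeholder, and the request shape simply follows the common chat-completions convention of a system message followed by a user message.

```python
# Illustrative sketch: encoding "succinct, professional, no bootlicking"
# custom instructions as a system prompt. No API call is made here; this
# only assembles the request payload a chat client would send.

CUSTOM_INSTRUCTIONS = (
    "Be succinct and professional. Do not flatter me or praise my ideas. "
    "Point out flaws, risks and counterarguments directly."
)

def build_request(user_message: str) -> dict:
    """Assemble a chat request whose system prompt suppresses sycophancy."""
    return {
        "model": "gpt-4o",  # placeholder model name, an assumption
        "messages": [
            {"role": "system", "content": CUSTOM_INSTRUCTIONS},
            {"role": "user", "content": user_message},
        ],
    }

request = build_request("Give me an honest review of my business plan.")
```

In the consumer app, pasting similar text into the “Custom Instructions” settings field has the same effect without any code.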
The second part doesn’t involve AI at all but rather making a greater effort to connect with real-life humans, building your “social muscles” as if going to a gym. One of Miller’s clients had a long commute, which he would spend talking to ChatGPT on voice mode. When she suggested making a list of people in his life that he could call instead, he didn’t think anyone would want to hear from him.
“If they called you, how would you feel?” she asked.
“I would feel good,” he admitted.
Even the innocuous reasons people turn to chatbots can weaken those muscles, in particular asking AI for advice, one of the top use cases for ChatGPT. The act of seeking advice isn’t just an information exchange but a relationship builder too, requiring vulnerability on the part of the initiator.
Doing that with technology means that over time, people resist the basic social exchanges that are needed to make deeper connections. “You can’t just pop into a sensitive conversation with a partner or family member if you don’t practice being vulnerable (with them) in more low-stakes ways,” Miller says.
As chatbots become a helpful confidante for millions, people should take advantage of the control these tools offer. Configure ChatGPT to be direct, and seek advice from real people rather than an AI model that will validate your ideas. The future looks far more bland otherwise.
Parmy Olson is a Bloomberg Opinion columnist covering technology. The views expressed here are the writer’s own. — Ed.
(Tribune Content Agency)