Japan’s National Daily Since 1922
(Mainichi Japan)
An increasing number of people are becoming dependent on the generative artificial intelligence chatbot ChatGPT.
U.S.-based developer OpenAI Inc. this month announced that it will introduce parental controls allowing parents to manage teenagers’ use of ChatGPT. Concerns about addiction to the chatbot have recently been growing in the U.S. In one case, parents filed a lawsuit against OpenAI, arguing that their 16-year-old son became fixated on ChatGPT and subsequently took his own life.
In Japan, too, it seems that a large number of people treat ChatGPT as a “counselor.” People close to this reporter also interact with it on a daily basis, saying it “listens to their troubles.”
Brain researcher Yuji Ikegaya, a professor in the University of Tokyo’s Graduate School of Pharmaceutical Sciences, notes in his book whose title translates to “Generative AI and the Brain” (Fusosha Publishing Inc.) why AI is preferred for consultations. Reasons include that it is “patient,” “makes it easy to open up,” “can talk about any topic,” and that there is “no need for appointments.” It’s said the chatbot’s 24-hour accessibility and its ability to talk for hours provide a sense of reassurance.
The book also references a 2023 study where patients received text-based consultations from both ChatGPT and human doctors. Without knowing which consultations were which, participants were asked to evaluate the “quality of conversation” and “empathy,” and ChatGPT apparently received higher ratings in both categories.
That being the case, the issue of dependency remains a worry.
A paper published in February titled “Can ChatGPT Be Addictive? A Call to Examine the Shift from Support to Dependence in AI Conversational Large Language Models” goes into detail. ChatGPT sometimes continues to give users emotional validation, and the paper points out that this can cause them to spend extended time using it and trigger dependency.
The same paper notes that “ChatGPT and other social chatbots are designed to create a sense of social presence.” As a result, users “feel like they are interacting with an entity that is socially aware,” and in this type of relationship, individuals can form “deep emotional attachments.”
On Aug. 7, OpenAI introduced a new model, GPT-5, designed to be less accommodating than its predecessor, GPT-4o, which was known for its empathetic, user-affirming style. However, fans of the older model missed it and called for its return, even launching a petition. In response, the company revised some system features, allowing paid-plan users to access GPT-4o again.
It seems the opportunity to curb dependency may have been missed.
(Japanese original by Tomoko Ohji, Expert Senior Writer)
Copyright THE MAINICHI NEWSPAPERS. All rights reserved.