Character.AI Limits Teen Chat Time to Enhance Safety Amid Mental Health Concerns – Букви

Character.AI is developing a new experience for users under 18 that eliminates the ability to chat with personas. Gabby Jones/Bloomberg/Getty Images
As reported by CNN:
Warning: this story touches on the topic of suicide. If you or a loved one is experiencing thoughts of self-harm or mental health issues, help is available. In the United States call or text 988 – the Suicide & Crisis Lifeline. Globally: The International Association for Suicide Prevention and Befrienders Worldwide have crisis-center contacts around the world.
Character Technologies, the parent company of the Character.AI chatbot platform, said on Wednesday that it will no longer allow teenagers to hold extended conversations with its AI-generated characters. The decision follows a series of lawsuits alleging that the app contributed to suicidal thoughts and mental health problems among teenagers.
The company will implement the changes by November 25, and teens will be given a two-hour limit on conversations for now. Instead of open chats, teenagers under 18 will be able to create videos, stories, and streams with the characters.
“We do not take this step of removing open Character chat lightly – but we do think that it’s the right thing to do given the questions that have been raised about how teens do, and should, interact with this new technology,” the company said in a statement.
The company emphasizes that user safety is its top priority. The statement notes that it is investing significant resources in its safety program and that it has released, and continues to develop, safety features, including resources to prevent self-harm and tools to protect underage users.
The decision was made after regulators’ inquiries and a review of the latest developments in technology and safety.
The conversation limits and age verification are new steps that Character Technologies plans to introduce on top of its existing measures. The company also announced that an AI Safety Lab will be established under the leadership of an independent nonprofit organization to conduct safety research on artificial intelligence in the entertainment space. This will extend Character.AI’s existing safeguards, including prompts urging users to contact support resources when suicide or self-harm comes up in conversation.
Within the industry, such steps reflect growing attention to teenagers’ mental health and safety in artificial intelligence products. The debate continues as companies across segments – from leading model developers to platforms built around interactive chatbots – seek a balance between innovation and responsibility toward younger users.
This trend points to an important vector in the industry’s development: creating a safe environment for users of all ages without restricting the opportunities technology offers for learning and entertainment.