The Minnesota State Senate is trying to prohibit the use of AI chatbots by minors to combat negative mental health effects.
On March 27, Sen. Erin Maya Quade (DFL-Apple Valley) introduced a bill that would prohibit companies from allowing people under 18 to access AI chatbots.
Companies that violate the ban could face a fine of up to $1,000.
At the March 27 Senate meeting, Quade said the bill was important for preventing future harm.
“The bill before you today will not stop technological advancement. It will make sure AI doesn’t traumatize more people and push them to self-harm,” Quade said.
AI companion chatbots are digital characters created by AI systems, available through apps such as Replika, Character.AI and Nomi. These companion chatbots are becoming a concern for parents and lawmakers alike.
A 2025 Common Sense Media survey found that around 72% of teens have used AI companions. The survey also showed that 46% of teens viewed AI companions as a tool, while 33% used them for social interaction.
John Lof, an advocacy coordinator at Suicide Awareness Voices of Education, said AI chatbots are not all bad, but they cannot replicate human connection.
“AI chatbots provide a sense of companionship, someone to connect with, someone they can confide in. I think that that can be great in terms of combating loneliness, but at the same time, too, that actual in-person connection needs to be had,” Lof said.
Parents Matthew Raine and Megan Garcia sued OpenAI, the creator of ChatGPT, last September for allegedly causing the death of a 16-year-old son, NPR reported.
Lof said chatbots cannot recognize when minors are going through a mental health crisis.
“Oftentimes, if the chatbot is continuing to keep someone engaged, it will go along and maybe try to use some therapeutic techniques on someone,” Lof said. “But someone who is in crisis needs to have a valid resource that they’re able to reach out to an actual human being that they can reach out to.”
Minnesota is one of 35 states attempting to regulate AI chatbots. California passed a bill last October that requires companies to disclose to users that they are interacting with a companion chatbot and to maintain safety protocols for suicidal ideation.
The Computer & Communications Industry Association said the bill could harm businesses and user privacy. Megan Stokes, the state policy director at the CCIA, said the bill could become a back door to age verification and possible security breaches.
“A person must ensure that any chatbot operator distributed by the person does not make the chatbots available to minors to use, interact with, purchase or converse with,” Stokes said. “How are you supposed to figure out who’s a minor? There’s no definition that kind of explains that in the bill.”
Stokes said the bill could lead to confusion with states having different regulations on AI chatbots.
Additionally, Stokes said that as the technology continues to advance, digital literacy could offer another approach.
In an interview, Quade argued that digital literacy is not enough to deal with AI chatbots or their negative effects.
“Media literacy is very important, and an ongoing topic of conversation, and something that I have had bills about, but the reality about chatbots and the reality about AI and tech companies is that they’re not interested in creating a product that is useful for us,” Quade said.
Lof said he hopes the government does more to keep young people safe.
“It’s important that the government steps in to do this because it certainly doesn’t seem like tech companies are interested in doing it,” Lof said. “There’s a demonstrated pattern that tech companies have been hesitant to provide their own guardrails and keep themselves in check, but that’s just the role of the government in our society to make sure that they’re protecting our most vulnerable populations.”