Help keep the City Paper free.
No paywalls.
No subscription cost.
Free delivery at 800 locations.
Help support independent journalism
by donating today.
Statehouse
An increasingly testy standoff between the Trump White House and state lawmakers of both parties came to South Carolina this week as a state Senate panel began moving two bills that aim to impose strict new regulations on artificial intelligence chatbots.
The bills, which supporters tell Statehouse Report are being combined into one package for a last-minute legislative push before the session ends May 14, would create what sponsors call legal guardrails for both adult consumers and children around popular chatbots such as ChatGPT and Claude.
The guardrails would include age verification and parental consent, restrictions on the storage and use of users’ personal data, and mandated notifications that chatbots are not licensed professionals like doctors, lawyers and therapists. Under the legislation, the state attorney general, county attorneys and private citizens could all sue for damages of up to $5,000 per violation.
“We’re coming to a point with this emerging technology that we’ve got to start flexing our AI legislative muscles,” Charleston Republican Sen. Matt Leber, a bill sponsor, told the panel on April 15. “This has very light guardrails, but it does put responsibility on the companies to make sure that their AI is not going to be used in a nefarious way.”
That approach puts Leber’s legislation in direct conflict with President Donald Trump, who has repeatedly pushed for federal legislation that would preempt such state measures, and reportedly twisted GOP leaders’ arms to kill similar bills in Florida and Utah.
“A patchwork of conflicting state laws would undermine American innovation and our ability to lead in the global AI race,” the White House said in a March 20 policy statement that called on Congress to pass a federal law protecting children and consumers.
University of South Carolina economist Joseph Von Nessen told Statehouse Report the White House’s concerns reflected textbook business worries about regulatory complexity.
To illustrate the issue, he said to imagine a call-center business that places calls into multiple states. In one state, chatbots are allowed. In another, they’re forbidden. And in another, they can only be used in very specific ways. Now, multiply that by 50 states.
“Navigating those sets of rules could lead to real inefficiencies,” Von Nessen said.
As for concerns that innovation could be stifled as 50 states advance differing regulatory frameworks, he said the worry is rooted in uncertainty.
“Uncertainty breeds paralysis,” he said, noting that that can be true whether the regulations under consideration are ultimately beneficial or not. “When businesses don’t know what to expect, they can’t make strategic plans on how to grow, adapt to the market and move forward.”
In an April 16 interview, Leber said he’s not unsympathetic to those concerns. But with Congress increasingly unable or unwilling to pass legislation on issues of public concern, he said, the states are being forced to step in.
“We can’t let the protection of South Carolinians be held in limbo because those guys can’t get their act together,” Leber said, noting that state action can help break the federal logjam.
“Oftentimes, when the states act, they begin to take the issue more seriously and start to move a little more quickly,” he said.
As for the need for the legislation, Leber pointed to “horror stories” senators heard in April 15 testimony from educators and parents’ groups, including documented instances of user suicide and highly sexualized chats with minors.
One of those parents was Fort Mill resident Kimberly Long of Mothers Against Media Addiction, who argued that children needed explicit legal protections from the mental health risks she said AIs pose.
“The dangers embedded in the rise of AI and chatbots are not the arena in which to give kids free rein,” Long said.
Charleston-area clinical psychologist Viktoriya Magid told Statehouse Report she’s familiar with those concerns, though in her practice, the chatbot-related issues she sees tend to involve patients who don’t realize they’re “talking to themselves” when they chat with a bot. And that, she said, can lead to erroneous, garbage-in, garbage-out self-diagnoses.
“What people don’t understand is that it’s going to give you advice based on what you feed it,” Magid said. “So I have folks coming in thinking they’re having these insights when it’s really just their own blind spots repackaged with a diagnosis that’s usually wrong.”
That’s precisely the kind of problem supporters say the bill tries to address with regular notifications that chatbots aren’t professionals.
Nevertheless, several business groups, including bankers, video game manufacturers and customer service industry representatives, testified against the legislation as written.
Specifically, they argued that its definition of chatbot — “an algorithmic or automated system that generates information through text, audio, image, or video in a manner that simulates interpersonal interactions or conversations including artificial intelligence” — is too broad, and would create expensive liability risks for everyday businesses the bill isn’t aimed at.
Witness Tom Mann, Southern state policy manager of the Computer and Communications Industry Association, expanded on that concern in an April 15 interview, telling Statehouse Report the definition would likely capture educational tools, productivity applications, research assistants and more.
“These things don’t fall into the high-risk category that legislators seem to be focused on, like companionship or maybe romantic chatbots,” Mann said, noting that some states, like California, have written legislation with specific carve-outs for “safe” business chatbot uses.
Leber and other senators at Wednesday’s hearing said they were open to amendments to address businesses’ concerns, but warned advocates to get their proposed changes in quickly, because time is running short and the legislative train is on the tracks.
“Folks, I would just encourage you to work with the sponsor or staff to get those amendments in,” committee Chairman Sean Bennett, R-Dorchester, told business representatives pointedly as the hearing closed. “At the next meeting, we are going to be moving forward.”