by Jessica Harthorn
KALAMAZOO, Mich. — Millions of Americans are turning to AI for emotional therapy.
A recent report in JAMA found that about 13% of young people now use AI chatbots for mental health advice.
And in a different survey, 44% said they'd rather talk with a chatbot than family or friends.
Parents now have a lot of questions about the ethical and potentially harmful implications.
I talked to AI experts at Kalamazoo Public Schools who are using the tool safely in the classroom and hosting seminars about "Parenting in the Digital World."
To parents, AI chatbots may seem like science fiction, but at Kalamazoo Public Schools, they're bringing nonfiction to life.
"It's been interesting to watch, because there was great resistance and then finally acceptance," Alex Miller, Kalamazoo Public Schools Coordinator of Instructional Technology said.
Miller is tasked with helping teachers and students become AI literate.
"And with AI, students will be able to really drive the learning process in a way that is personalized for their needs, their learning needs, their learning styles," Miller said.
After a tough vetting process, KPS landed on a program called "Magic School AI" that protects student data and privacy.
"So what's cool about this particular tool, Magic School, is that teachers can launch a room and assign students particular tools, like a study bot, or a quiz me, or a text rewriter, or chat with a character," Miller said.
Miller showed News Channel 3 Anchor Jessica Harthorn an example of how second graders are using chatbots to talk to a book character, or ask a historical figure questions as part of their research.
"I've devoted my life to the struggle for equality and justice for all people, regardless of the color of their skin," the AI chatbot said when it responded to a question Miller asked.
"Wow, so it's kind of bringing the story alive," Harthorn said.
"Yeah, yeah, it's allowing for different levels of engagement, that were not possible before," Miller said.
But there is a dark side to AI chatbots being used by kids outside of school.
Parent Mustafa Nazari attended the KPS "Parenting in the Digital World" seminar, after he said his kids were asked personal questions by an AI chatbot at home.
"AI was asking a lot of different kinds of questions about her name, her age, all these things," Nazari said.
According to a 2026 Pew Research Center poll, just over half of U.S. teens said they've used chatbots for help with schoolwork, and 12% say they’ve gotten emotional support.
"What do you think about that?" Harthorn asked Nazari.
"That's true, because AI is very soft and AI is very supportive, not only for the kids, even I heard about the adults," Nazari said.
Miller says KPS is taking a proactive approach, teaching students to think critically about how they're engaging with AI as a companion.
"Does it concern you that kids are turning to AI for therapy?" Harthorn asked.
"Yeah, I think that oversight is essential, especially when it comes to mental health. You hear terrible stories of students getting bad advice and making really bad decisions. It's our goal to prevent that," Miller said.
Miller said these are the red flags for parents to watch for: kids who start to pull away from the outside world, who overrely on the tool for decision-making or homework, or who use it constantly, which could indicate an addiction.
Mental health experts said AI chatbots can also miss emotional cues, reinforce harmful beliefs, or fail to safeguard users in crisis.
"It is overall very generic. It does not understand the context of your life, your lived experience, your unique situation and background, something that only a trained therapist through interaction, in real time, with somebody would say. You can tell so much about a person face to face," Dr. Sue Varma, a Board-Certified Psychiatrist said.
Nazari said he's pleased KPS is getting in front of the issue.
"The kids should be raised in a way that they should be able to control what is right and what's wrong when they are using all these platforms," Nazari said.
That's especially important given that a Pew Research Center poll finds teens tend to view AI's impact on their own lives more positively than negatively.
© 2026 Sinclair, Inc.