Counseling center responds to rise of AI chatbots – The Lafayette – The Oldest College Newspaper in Pennsylvania

Available 24/7, no awkward small talk and never tells you “no” — the artificial intelligence chatbot in your pocket may seem like everything a therapist isn’t. 
But the Lafayette College Counseling Center is hoping to push back.
“It might not challenge folks in a way that a friend, a psychologist, a mentor might,” said Fallyn Lee, the counseling center’s prevention coordinator, about the lack of human connection in AI tools.
The counseling center recently released an advisory warning students against using artificial intelligence for mental health support. The advisory, citing studies documenting young adults' growing reliance on chatbots for companionship and health advice, warned students against forming unhealthy relationships with AI and emphasized the importance of protecting user data.
“We just encourage people to be curious, to be thoughtful, to be mindful in different ways,” Lee said.
Renee Cantwell, a licensed psychologist and founder of Easton Neuropsychology and Behavioral Services, said AI and other online health tools should only be used “in conjunction” with traditional therapy or telehealth.
“That’s a very sort of special interaction,” Cantwell said of patient-psychologist relationships. “AI cannot replicate that. It’s sort of one sided.”
Psychology professor Michael Nees explained that, unlike traditional tools built to perform specific tasks, chatbots generate text without being designed for any clearly defined purpose.
“People are treating the word machines as therapists and companions, often apparently without having a clear mental model of how LLMs actually work,” Nees wrote, adding that some research and anecdotal evidence suggest relying on AI for emotional support may be harmful.
Cantwell said AI use could lead to a myriad of issues, including privacy risks.
“You don’t necessarily know what your information is being used for or where it’s going,” Cantwell said, citing concerns related to the Health Insurance Portability and Accountability Act of 1996.
Multiple wrongful-death lawsuits have been filed against major tech companies, including Google and OpenAI, in the past year after users, some of them children, sought mental health help from AI. One lawsuit alleged that ChatGPT encouraged a user to carry out the mass shooting at Florida State University last year.
National publications are crowded with articles exploring the seemingly deep, but often harmful, relationships that some have developed with AI chatbots.
Lee said that AI tools tended to be “overflattering” rather than challenge “maladaptive thought patterns,” while Cantwell raised concerns over chatbots “validating unhealthy behaviors.”
Psychology professor Chu Chu wrote in an email that AI tools “are developed in the industry or [by] computer scientists, who do not have training in psychometrics or psychological measurement.”
“It would be difficult to select high-quality tools that do not simply play into users’ confirmation bias, but to actually help them understand themselves,” she wrote.
Students seeking mental health services can schedule an appointment with the counseling center or use TogetherAll, a free and anonymous support group available to all Lafayette students.
Harrison Meyer ‘29 contributed reporting.



