As artificial intelligence becomes embedded in colleges’ business and administrative operations, academic leaders are growing more skeptical about its role in teaching and learning.
With student AI use now pervasive in classrooms, libraries and residence halls, many faculty members and instructional staff say they increasingly feel as though they are competing with AI systems to educate students—often without clear institutional guidance. In response, educators are redesigning instruction and assessment to remain students’ primary source of guidance.
At Flagler College, the number of students requesting tutoring has steadily declined since 2023, when ChatGPT became widely available, Director of Library Services Trina McCowan notes.
New AI features in Flagler’s library databases let students query articles and refine research questions. Still, McCowan fears that fluent, polished AI output can mask skill deficiencies, raising questions about whether graduates are developing the skills employers need.
“What we’re seeing in writing and upper math classes is that students don’t have the basics down,” she says. “There are a lot of liberal arts professors who hate the thought of students using [AI] because it’s circumventing their ability to think critically.”
Some AI companies aggressively promote use among students, writes Matthew Connelly, a vice dean for artificial intelligence at Columbia University, for The New York Times.
Anthropic pays campus ambassadors to promote using its chatbot, Claude. Google offered free access to its AI tools throughout the academic year.
“In reality, A.I. companies seem to look at college students as a strapped customer base to hook when they are most stressed,” Connelly writes.
McCowan suggests colleges cannot compete with AI. Instead, they should show students which resources are uniquely available on campus and promote ethical standards and quality expectations that go beyond simply producing the correct answer.
“Instead of a total ban, we have to teach students how to use [AI tools] because this is the way the world is going,” she says. “We want our graduates to be able to use the tools that they’re expected to know. And we just need a nuanced approach to it.”
At DePauw University, English professor Harry Brown describes the difficulty of distinguishing student writing from AI-assisted prose. Rather than chasing certainty, he has shifted his assessments toward what he calls the “architecture of thought.”
“I’ve gotten into the practice of assessing their thinking as much as their writing, so that then when I do go into class or I meet individually with students, I get a sense of how they’re approaching the problem,” he says.
The emphasis moves from product to process, Brown continues, echoing long-standing shifts in fields like computer science, where program design matters more than code alone.
In Wake Forest University’s Master of Science in Management program, students now treat AI less as an authority and more as a collaborator whose output must be scrutinized. In one exercise, they analyzed their personality test results with AI to explore how individuals can best collaborate in a group setting.
Students have since learned how to validate, interrogate and reject AI’s output. “Our goal is to help students think in the absence of knowledge,” Assistant Dean Bren Varner says. “We want students to navigate these technologies as they rapidly evolve and navigate the use of that information, much like they will in their career.”
AI excels at summarizing texts, drafting essays and conducting basic analysis. Professors can adapt by elevating their curriculum beyond those tasks.
Wake Forest’s management program now emphasizes skills traditionally developed years into a career, such as leadership, influence, communication and tolerance for ambiguity.
Students are assessed through experiential learning and simulations rather than exams, allowing them to apply the technology as they would in a changing workforce.
“When more businesses are relying on AI to automate the entry-level analysis roles we typically helped fulfill, it changes the learning outcomes that we focus on in the program,” Varner says.
Brown, at DePauw, draws on British historian Niall Ferguson’s learning framework, which contrasts a “cloister” approach to learning (deliberate, unplugged thinking) with a “spaceship” approach, in which AI is used intentionally and transparently.
Students are expected to understand when technology supports or replaces their thinking. “I don’t think the spaceship will automatically cancel out the cloister, unless we allow it,” Brown says. “Colleges need to make spaces for both.”