AI Is Poised to Reshape Social-Emotional Learning. But for Better or Worse? – Education Week

Will artificial intelligence short-circuit students’ social-emotional development? Or can it help them build vital interpersonal skills in an increasingly tech-driven world?
Those are questions many in the field of social-emotional learning are grappling with, as evidenced by the number of sessions devoted to the topic at the annual online conference of the Collaborative for Academic, Social, and Emotional Learning this month.
They’re no longer theoretical challenges for educators: AI-powered products are becoming increasingly powerful, and existing education tools that incorporate AI are being developed with the intention of supporting kids’ social-emotional learning.

And overall, students’ use of artificial intelligence tools is growing. A majority of teens are familiar with AI: Sixty-four percent said in a recent Gallup survey that they have used a generative AI chatbot, with 3 in 10 saying they use one daily.
David Adams, a social-emotional-learning expert and the chief executive officer for the Urban Assembly, a nonprofit school support organization, was among the speakers presenting on the connection between AI and SEL at CASEL’s virtual conference. There’s still much educators and researchers don’t know about how AI will affect students’ social-emotional development, he said.
It’s intriguing to see “how people have used AI to help through social and emotional spaces and problems,” such as asking a chatbot to help talk through a family conflict, Adams said. “I’m curious about how [AI] will impact what I call emotional offload. Cognitive offload is when we cede reasoning to our AI, and emotional offload is when we cede social and emotional reasoning and skill development to our AI.”
AI has the power to influence young people’s sense of self and reshape how they form relationships. At the same time, many educators have voiced concerns about whether AI will undermine students’ ability to think critically and distinguish fact from fiction online.

Mental health experts have raised alarms about whether adolescents can distinguish the simulated empathy of chatbots and companion bots from genuine human understanding, according to a health advisory issued by the American Psychological Association.
Then there are concerns around privacy, said Justin Long, a researcher at Instructure, the education technology company that owns Canvas, the popular learning-management system.
“How is the data being used and how is it being stored?” he said during a presentation at CASEL’s conference. “Do people know they’re talking to a bot? And especially when talking about SEL, it is really sensitive information. How is it being protected?”
At the same time, AI chatbots can potentially give young people an opportunity to practice and build social-emotional skills in a low-stakes environment. For example, students can rehearse difficult conversations with a peer or teacher before having them in real life. They can also use chatbots to practice interviewing for jobs.
AI can likely provide social-emotional and well-being supports to students safely, so long as it’s through a tool specifically designed for that task, said Clarke Heyes, a licensed marriage and family therapist and clinical research and development manager at Alongside. Heyes is part of the clinical team at Alongside that helped develop its wellness app for students in grades 4-12.

“Even as a developer of an AI tool, AI is not a replacement for school mental health professionals, it’s a support tool,” Heyes said on a panel at CASEL’s conference. “If it’s purpose-built and ethically deployed, it can expand access, reduce barriers, and free clinicians and other staff to focus on things that humans do best.”

Similar parameters make sense for teachers’ use of AI. Generative AI chatbots can help them weave social-emotional learning into their daily lessons, for example, by brainstorming mindfulness strategies or suggesting journaling prompts.
But while AI may be able to reinforce some student SEL skills, it cannot replace the role of adults in teaching social-emotional skills and building relationships with students, said Kim Normand Dorbin, a co-founder of Free the Mind Co, an emotional-intelligence and well-being platform for students.
“AI is about the practice, it’s not the replacement of an educator,” she said in her presentation at the CASEL event. “Adults create the safety and the emotional connection. AI does not teach empathy or relationship-building. Adults do that off-screen.”
AI is hardly the first new technology to pose challenges to young people’s social-emotional development. There are widespread concerns, for instance, about students’ immersion in social media, which has been linked to negative health effects and lower cognitive performance.
Technological changes are constantly altering the conditions in which kids develop social-emotional skills, said the Urban Assembly’s Adams.
He used an example from a different era: The development of microwaves allowed people to reheat food and eat at any time, and in so doing interrupted the ritual of the family dinner—a prime time for teaching and reinforcing kids’ social-emotional-learning skills, he said.

Educators and policymakers should be proactive in thinking about how AI will affect students’ social-emotional development, Adams said. Adults need to set goals for what they want SEL and AI to accomplish, and then evaluate the tools against those standards.
“By naming what we need young people to be able to know and do, we should be in the driver’s seat [in] thinking about, ‘What are the implications of this technology?’” he said in his presentation.
The reaction, he added, shouldn’t be one of, “‘Oh my gosh, it turns out that young people spending 16 hours on screens impacts their interpersonal skills.’ We should be able to anticipate that.”

