Experts warn AI is making your brain work less – BBC

What was the last thing you asked an AI chatbot to do for you?
Maybe you asked it for an essay structure to help answer a tricky question, provide an insightful analysis of a chunky data set, or to check if your cover letter matches the job description.
Some experts worry that outsourcing these kinds of tasks means your brain is working less – and could even be harming your critical thinking and problem-solving skills.
Earlier this year, the Massachusetts Institute of Technology (MIT) published a study showing that people who used ChatGPT to write essays showed less activity in brain networks associated with cognitive processing while undertaking the exercise.
These people also couldn't quote from their essays as easily as those in the study who didn't use an AI chatbot.
The researchers said their study demonstrated "the pressing matter of exploring a possible decrease in learning skills".
All 54 participants were recruited from MIT and nearby universities. Their brain activity was recorded using electroencephalography (EEG), which involves electrodes being placed on the scalp.
Some of the prompts used by the participants asked the AI to summarise essay questions, track down sources, and refine grammar and style.
It was also used to generate and articulate ideas – but some users felt AI wasn't very good at this.
Separately, Carnegie Mellon University and Microsoft, which operates Copilot, found people's problem-solving skills could diminish if they became too reliant on AI.
They surveyed 319 white-collar workers who used AI tools for their jobs at least once per week about how they apply critical thinking when using them.
They looked at 900 examples of tasks given to AI, ranging from analysing data for new insights to checking whether a piece of work satisfies particular rules.
The study found that higher confidence in the tool's ability to perform a task was related to "less critical thinking effort".
"While GenAI can improve worker efficiency, it can inhibit critical engagement with work and can potentially lead to long-term overreliance on the tool and diminished skill for independent problem-solving."
Schoolchildren in the UK were similarly surveyed for a study published in October by Oxford University Press (OUP).
It found six in 10 felt AI had negatively impacted their skills in relation to schoolwork.
So, with the massive explosion of AI use, are our cognitive skills at risk of decline?
Not necessarily, says Dr Alexandra Tomescu, a generative AI specialist at OUP who worked on the school survey.
"Our research tells us that nine in 10 students say AI has helped them develop at least one skill related to schoolwork – be it problem-solving, creativity or revision.
"But at the same time, about a quarter state that AI use made it too easy to do work for them… So [it's] quite a nuanced picture."
She adds that many pupils want more guidance on how to use AI.
OpenAI, whose ChatGPT has more than 800 million weekly active users according to boss Sam Altman, has published a set of 100 prompts for students designed to help them get the most out of the technology.
But Prof Wayne Holmes, who researches critical studies of artificial intelligence and education at University College London (UCL), says this isn't enough.
He wants much more academic research to be done about the effects of AI tools on learning before pupils and students are encouraged to use them.
He tells us: "Today there is no independent evidence at scale for the effectiveness of these tools in education, or for their safety, or even for the idea they have a positive impact."
Prof Holmes points to research about cognitive atrophy, where someone's abilities and skills become worse after using AI.
He says this has been a problem for radiologists who use AI tools to help them interpret X-rays before they diagnose patients.
A study by Harvard Medical School published last year found AI assistance did improve the performance of some clinicians but worsened that of others, for reasons the researchers don't fully understand.
The authors called for more work to be done on how humans interact with AI so we can figure out ways of using AI tools that "boost human performance rather than hurt it".
Prof Holmes fears that students, whether in school or university, could become too reliant on AI to do their work for them and not develop the fundamental skills an education provides.
A student's essay might receive better marks thanks to help from AI but the issue is whether they end up understanding less.
As Prof Holmes puts it: "Their outputs are better but actually their learning is worse."
Jayna Devani, who leads international education at OpenAI – the company that owns ChatGPT – and helped secure a deal with the University of Oxford, says the firm is "very aware of this debate right now".
She tells the BBC: "We definitely don't think students should be using ChatGPT to outsource work".
In her view, it's best used as a tutor rather than just a provider of answers.
The example she gives is of a student having a back and forth with ChatGPT using its study mode setting.
You enter the question you're having difficulty answering, and the chatbot breaks it down into its components and helps you understand it.
She pictures a student working late at night on an assignment about a topic they don't quite understand.
"[If] you have an upcoming presentation to give and… it's midnight, you're not going to email your [university] tutor and ask for help," she says.
"I think the potential is truly there for ChatGPT to accelerate learning when it's used in a targeted way."
But Prof Holmes insists that any student who uses AI tools should be aware of how their reasoning works and how the companies providing them handle data. He stresses that results should always be checked.
"It is not just the latest iteration of the calculator," he says, describing AI's far-reaching capabilities and implications.
"I never say to my students, you shouldn't use AI… But what I do try to say is look, we need to understand all these different things about it so that you can make informed decisions."
Copyright 2025 BBC. All rights reserved.