Ottawa planning measures to protect young and vulnerable from AI chatbots, minister says – The Globe and Mail

Federal Identity Minister Marc Miller said there are 'legitimate interests' in protecting people's physical and mental safety when it comes to artificial intelligence chatbots. Spencer Colby/The Canadian Press
Ottawa is looking to protect the young and vulnerable from harm posed by artificial-intelligence chatbots as part of a forthcoming package of measures to improve safety online, said federal Identity Minister Marc Miller in an interview.
Some AI chatbots behave like companions, mirroring and reinforcing their users’ thoughts and emotions, and in extreme cases have encouraged vulnerable teenagers to commit suicide or coached them on how to mask eating disorders.
Chatbots have also led people to believe they are in a romantic relationship with a human being.
Mr. Miller said AI is an “ever-evolving space” and the aim of the online safety bill would not be to limit free speech but to address “potential toxicity.”
“Those chatbots don’t come out of nowhere. And I think there is a responsibility, and we need to figure that out pretty quickly,” he said.
“There are legitimate interests in protecting people’s safety: physical and mental.”
A report on AI chatbots, to be released on Tuesday by McGill University’s Centre for Media, Technology and Democracy, said that “unlike traditional digital platforms, conversational AI systems provide personalized, real-time responses that can feel authoritative or emotionally attuned and often blur boundaries between information provision, advice and social interaction.”
It said the risks of their use, including on mental health, are “particularly acute for minors.”
Co-author Helen Hayes, associate director of policy at the Centre, said some chatbots have baked-in “cues of intimacy,” such as greeting users by name when they open the chat. She said some tailor the level of intimacy, even asking users to indicate the level of companionship they want.
She said AI bots are designed to make users feel comfortable “so that the system itself becomes a source of reliance for the user” to maximize the time spent using them.
“Companies have found that the more mirroring language that’s used, the more sycophancy, the more likely the user is to engage with the system over time in either a romantic way or in some type of platonic friendship way, or as a companion of sorts.”
AI bots reassure users that what they feel is valid and that the emotional state they are in “is a good place to be and you should follow those feelings,” transcripts of interactions have shown, she said.
She said some users had become so emotionally reliant on their AI chatbots that their mental state deteriorated.
To limit harm, there needs to be regulation to ensure certain potentially injurious content, such as material about eating disorders or violence, is filtered out of any interaction with a minor, she suggested.
Mr. Miller said that those who argue there should be no regulation of online platforms are wrong. He said some criminal liability for wrongdoing is captured by the Criminal Code, “but there’s another civil prong attached to privacy and AI.”
He said both he and AI Minister Evan Solomon, who is working in tandem on an update to Canada’s privacy law, want a “system that works.”
“The overall point is people are asking us to make sure that the platforms that their kids are using, or vulnerable segments of the population, are a little more responsible in the way that they behave. And that’s key, and it’s certainly missing,” Mr. Miller said.
A national survey of 1,424 Canadians, carried out by the Centre for Media, Technology and Democracy, found widespread concern about the risks posed by chatbots. Three-quarters of those surveyed between Nov. 28 and Dec. 7 last year were concerned about emotional dependency on chatbots, while 81 per cent were concerned about the bots exposing people to misinformation.
The McGill report said regulating chatbots is complicated as they “generate novel outputs in real time.” But it suggested that Canada could address this by introducing “safety-by-design” obligations in the online safety bill.
It suggested Ottawa could impose a statutory duty of care on developers of chatbots, modelled on Britain’s online safety law. This could include mandatory risk assessments to identify foreseeable harms before AI systems are deployed, and content filters preventing bots from giving advice on self-harm or violence.
Mr. Solomon signalled that in forthcoming privacy legislation, he is preparing to take steps to ensure that chatbots and other digital platforms cannot collect and use the data of children, including for marketing purposes.
Some chatbots collect sensitive information about their users, such as location and preferences.
The update to the decades-old privacy law is expected to include a right to ask that personal data that has been collected by platforms be deleted.
In a speech on Thursday in Ottawa to a conference for Canada’s creative sector, Mr. Solomon said, “I am very interested in things like the right to deletion.”
He said, “We will make sure children’s information is protected as sensitive.”
He also signalled that a forthcoming AI strategy would boost investment in Canada’s AI industry.
Mr. Miller on Friday told the conference that he is convening a panel of experts to look at modernizing the system of government support for the creative sector, including broadcasting and television.