AI Minister Evan Solomon said he is working with other ministers on legislative options, including potentially regulating AI chatbot use by children. Justin Tang/The Canadian Press
Artificial Intelligence Minister Evan Solomon has summoned OpenAI’s top safety representatives to Ottawa after it was revealed that the tech giant had flagged, but not reported to authorities, worrying interactions between the Tumbler Ridge shooter and its ChatGPT chatbot months before the deadly attack.
Mr. Solomon said his team met with OpenAI representatives on Sunday and that he will sit down with the company’s safety chiefs in Ottawa on Tuesday.
“We will have a sit-down meeting to have an explanation of their safety protocols and when they escalate, and their thresholds of escalation to police, so we have a better understanding of what’s happening and what they do,” Mr. Solomon told reporters in Ottawa on Monday.
OpenAI confirmed Friday that the shooter’s account was banned last June for violating the company’s usage policy, but said that her activity did not meet the company’s threshold for notifying law enforcement. A user’s messages to the chatbot would have to indicate an “imminent and credible risk of serious physical harm to others” for that threshold to be met, OpenAI said in a statement.
The shooter killed her mother and half-brother at the home they shared before heading to Tumbler Ridge Secondary School, where she killed five students and a teacher’s aide. The shooter, whom RCMP have identified as Jesse Van Rootselaar, then killed herself at the school as police responded to the scene.
The shooting and OpenAI’s decision not to reveal the earlier warnings come as governments around the world grapple with how to regulate the fast-developing technology.
Mr. Solomon said he is working closely with other ministers on legislative options, such as potentially regulating AI chatbot use by children. He is expected to publish a government AI strategy soon.
He added that he is working with Canadian Identity Minister Marc Miller, whose department is preparing an online harms bill expected later this year, as well as Justice Minister Sean Fraser and Public Safety Minister Gary Anandasangaree.
The Wall Street Journal reported Friday that, while using ChatGPT in June, the shooter “described scenarios involving gun violence over several days,” which were flagged by an automated review system. About a dozen employees debated taking action, with some interpreting the writings as an indication of potential for real-world violence and urging leaders to alert Canadian law enforcement, the WSJ reported. The company ultimately did not contact authorities.
“From the outside, it looks like OpenAI had the opportunity to prevent this tragedy, to prevent this horrific loss of life, to prevent there from being dead children in British Columbia,” B.C. Premier David Eby told reporters on Monday. “I’m angry about that.”
On Feb. 11, the day after the shooting, OpenAI officials had a previously planned meeting with a B.C. government representative about the possibility of opening a satellite office in Canada. The company did not disclose at that meeting that it had potential evidence regarding the shootings in Tumbler Ridge, the B.C. government said.
On Feb. 12, OpenAI requested contact information for the RCMP.
In a statement to The Globe and Mail on Saturday, OpenAI said as soon as it became aware of the shooter’s identity in media reports on Feb. 11, it reached out to the U.S. Federal Bureau of Investigation to pass along information to the RCMP. This is how the company has handled such cross-border law enforcement communications about its users in the past, it said.
The FBI declined to comment on the matter, while the RCMP confirmed OpenAI reached out after the shooting.
“As part of the investigation, digital and physical evidence is being collected, prioritized, and methodically processed,” RCMP Staff Sergeant Kris Clark said in an email to The Globe and Mail on Monday. “This includes a thorough review of the content on electronic devices, as well as social media and online activities.”
In a statement Monday, OpenAI said it is supporting Mounties in their work, and that its senior leaders will discuss at Tuesday’s meeting in Ottawa the company’s “overall approach to safety, safeguards we have in place, and how we continuously work to strengthen them.”
With reports from Joe Castaldo and The Canadian Press
© Copyright 2026 The Globe and Mail Inc. All rights reserved.