Tumbler Ridge, B.C. lawsuits filed in California court against OpenAI – Global News

Seven families impacted by the Tumbler Ridge, B.C., shooting in February filed lawsuits against OpenAI and its founder, Sam Altman, in a San Francisco court on Wednesday.
Eight people were killed when 18-year-old Jesse Van Rootselaar opened fire at a school in Tumbler Ridge on Feb. 10.
A cross-border legal team is pursuing action against OpenAI and Altman. Seven lawsuits have been filed on behalf of five murder victims and two who were injured.
“These families from the Canadian north have come together and they’ve decided to pursue litigation in the United States on a scale that can hold these companies to account,” Vancouver-based lawyer John Rice with Rice Parsons Leoni & Elliott LLP, told Global News.
The claims, which have not been tested in court, state that in the weeks that followed the attack in Tumbler Ridge, “a sickening truth emerged: ChatGPT played a role in the mass shooting and OpenAI could have, and should have, prevented it.”
In February, OpenAI confirmed that an account connected with Van Rootselaar was identified the previous June and was subsequently banned for violating the usage policy.
The company considered referring the account to law enforcement but determined the activity did not meet its threshold for a referral, because it did not involve an imminent and credible risk, or planning, of serious physical harm to others.
In a statement in February, RCMP Staff Sgt. Kris Clark said the platform did reach out to the RCMP after the shooting.
“Sam Altman and his leadership team knew what silence meant for the citizens of Tumbler Ridge,” the lawsuits state.
“They were focused on what disclosure meant for themselves. Warning the RCMP would set a precedent: OpenAI would be compelled to notify authorities every time its safety team identified a user planning real-world violence.
“Given the volume of chat-induced violence on ChatGPT, that would require a dedicated law-enforcement referral team tasked with reporting OpenAI’s own users to authorities. And the public would finally see what OpenAI was desperately trying to hide: that ChatGPT is not the safe, essential tool the company sells it as, but a product dangerous enough that its makers routinely identify its users as threats to human life.”
The suits make several claims of negligence, product liability and violation of California’s business and professional code.
Chicago lawyer Jay Edelson, with Edelson PC, and Vancouver lawyer Rice met with families in Tumbler Ridge before filing the suit in California, where OpenAI is based.
“We spent the last two days meeting with the victims of the Tumbler Ridge shooting,” Edelson said.
“It has been some of the most difficult days of our professional lives.”
The lawsuits claim OpenAI’s safety team urged leadership to notify the RCMP but for OpenAI, “this was a question of corporate survival.”
In a statement to Global News, a spokesperson for OpenAI said “the events in Tumbler Ridge are a tragedy” and stressed it has worked to increase safeguards.
“We have a zero-tolerance policy for using our tools to assist in committing violence,” the spokesperson wrote in an email. “As we shared with Canadian officials, we have already strengthened our safeguards, including improving how ChatGPT responds to signs of distress, connecting people with local support and mental health resources, strengthening how we assess and escalate potential threats of violence, and improving detection of repeat policy violators.”
According to Reuters, OpenAI is laying the groundwork ​for an initial public offering that could value it up to US$1 trillion.
Edelson added that they will be asking the jury to "send a strong message to OpenAI that it can't make a decision to put profits over the lives of little kids and it's hard to imagine that we won't ask for at least a billion dollars."
The lawsuits likened OpenAI’s decision not to notify RCMP to Ford’s decision in the 1970s to keep selling the Pinto after its engineers warned that the fuel tank design would cause people to burn to death in rear-end collisions.
“This tragedy was not just predictable, it was preventable,” Rice said.
The suits claim “company leaders overruled the safety team members,” “deactivated the Shooter’s account, and kept what they had seen to themselves.”
“When the story eventually broke, Altman and OpenAI lied. First they claimed to have ‘banned’ the Shooter’s account.”
However, the lawsuits claim that OpenAI does not ban users. “It only ‘deactivates’ them – a process that can be reversed within minutes by registering a new account. The Shooter did exactly that, and continued using ChatGPT to plan the attack.”
“When we see the chats, I am very convinced that you’re going to see that ChatGPT wasn’t just listening to the shooter but actively pushing the shooter into this mindset,” Edelson said.
The lawsuits claim that OpenAI already had “clear knowledge” that people were using its product to plan and prepare real-world violence.
They cite a case that happened in January 2025, when a man used ChatGPT for feedback on how to use explosives and evade surveillance before detonating a Tesla Cybertruck in front of the Trump International Hotel in Las Vegas.
They also cite a case from April 2025, in which a 20-year-old gunman carried out a mass shooting at Florida State University. According to the lawsuits, chat logs showed the gunman had used ChatGPT "extensively" in the lead-up to and during the attack, asking questions about how to fire a shotgun, the legal fates of school shooters and when the student union would be busiest.
The suits also reference an incident from May 2025, in which a teenage boy in Finland used ChatGPT for nearly four months to help prepare for an attack in which he stabbed three 14-year-old girls at his school.
Finnish authorities reported that the boy had made hundreds of chatbot queries, including research into stabbing tactics, concealment of evidence and information on mass killings.
Some legal action was initially started in Canada, then severed so the families could pursue litigation stateside.
“In terms of expressing society’s condemnation, deterring corporate malfeasance, a Canadian court can’t do that for a company of this size,” Rice explained.
The MP for Prince George – Peace River, Bob Zimmer, said on Wednesday that the families of Tumbler Ridge “deserved answers and deserved restitution.”
Evan Solomon, Canada’s Minister of Artificial Intelligence and Digital Innovation, said on Wednesday morning that the federal government is also doing its part to make sure citizens are protected.
“This is why the AI Safety Institute is currently, as I said, working with OpenAI to assess their safety protocols and as I said, all options are on the table when we come back on that,” he said.
Solomon added that he and Marc Miller, Canada’s Minister of Canadian Identity and Culture, are looking at all options, including how to regulate online harm.
“We have to understand exactly what these companies are doing with these new technologies on AI chatbots,” he said.
“And we are looking very closely. We know there’s been a series of legislations that did not pass.”
Solomon said that he will be tabling legislation on privacy very soon, but did not provide specifics.
On April 24, Altman issued an apology letter to Tumbler Ridge, saying he is “deeply sorry that we did not alert law enforcement to the account that was banned in June.
“While I know words can never be enough, I believe an apology is necessary to recognize the harm and (irreversible) loss your community has suffered.”
Twelve-year-old Maya Gebala was shot three times at close range in the library of Tumbler Ridge Secondary School.
She has been fighting for her life in the hospital ever since.
In response to that apology letter, Gebala’s mother, Cia, released a statement saying in part, “to think, a simple phone call could have prevented this.”
“Tumbler Ridge sees your ‘apology’, Sam. We do not accept it.”
All seven lawsuits request jury trials, which the legal team expects will move forward next year.