Why the families of Tumbler Ridge shooting victims may face ‘difficult’ issues with OpenAI lawsuits

Press Room · Published May 2, 2026 · Last updated May 2, 2026 at 5:39 a.m.

The families of victims of the Tumbler Ridge, B.C., school shooting who are suing OpenAI could face some significant legal hurdles in their attempt to hold the artificial intelligence company partially responsible for the attack.

“As with so much in AI, the lawsuit takes us into uncharted territory,” said Robin Feldman, director of the AI Law & Innovation Institute at UC Law San Francisco.

Feldman said there are a number of legal issues the court will have to grapple with that will be “difficult for the plaintiffs,” who allege OpenAI failed to warn police about the shooter’s interactions with the company’s chatbot ChatGPT.

Those issues include whether OpenAI had a “duty to act” by contacting law enforcement, and whether its failure to do so caused the attack, she said.

The case highlights concerns about the tech industry’s obligations to control and monitor chatbots, and to notify authorities when chatbot users appear to be planning violence.

On Feb. 10, 18-year-old Jesse Van Rootselaar shot and killed her mother and half-brother at home before fatally shooting five children and an educator at the local secondary school, as well as injuring numerous others. She died of a self-inflicted injury.

According to seven lawsuits filed in U.S. federal court in San Francisco, the attack was “an entirely foreseeable result of deliberate design choices OpenAI made with full knowledge of where those choices led.”

‘Conscious decision not to warn authorities’

“OpenAI knew the Shooter was planning the attack and, after a contentious internal debate, made the conscious decision not to warn authorities,” the lawsuits say.

The lawsuits claim that the shooter’s ChatGPT conversations, which included gun violence scenarios, had been flagged, and safety team members recommended contacting the police.

But the lawsuits allege that OpenAI leadership overruled the safety team and police were never called. The lawsuits allege that OpenAI could have and should have prevented the shooting.

Does OpenAI have a ‘special relationship’ with users?

Colin Doyle, an associate professor of law at LMU Loyola Law School in Los Angeles, said what makes this case unique is that, among the lawsuits filed against OpenAI and other generative AI platforms, it is the first to focus on a “failure to warn.”

Doyle said that under California tort law, people generally do not have a legal duty to control the actions of others; there is no so-called Good Samaritan obligation to act.

WATCH | Families of victims file lawsuit:

Families of Tumbler Ridge shooting victims file lawsuit against OpenAI

Families of the mass shooting victims in Tumbler Ridge, B.C., have filed a lawsuit in California against OpenAI. The claim alleges that the creators of ChatGPT failed to notify law enforcement about the shooter’s account activity prior to the attack that killed eight people, including six children.

However, the tort system does impose liability under the idea of a “special relationship,” he said. For example, if a psychiatrist has determined their patient poses a credible threat, they would have a duty to warn authorities, Doyle said.

“Now, the question in this context is, does OpenAI have that special relationship?”

In this case, Doyle said, it’s not the direct actions of OpenAI and ChatGPT that caused the deaths, but the actions of a third party. He suggested it could be like trying to blame a car manufacturer after a driver ran someone over.

“Generally, our legal system disfavours having a company be responsible for what are those independent actions of others.”

Another key question will be whether Section 230 of the Communications Decency Act — a law that shields tech companies from liability for content that their users post — applies in this case, Feldman said.

Under Section 230, platforms are considered bulletin boards or publishing houses that are not liable for the content that users post.

Tumbler Ridge Secondary School is pictured the day after a school shooting on Feb. 11, 2026. (Ben Nelms/CBC)

“Is ChatGPT like a bulletin board or publisher, or is ChatGPT like a facilitator who helped the crime?” she said.

Sharon Bauer, a Toronto-based privacy lawyer and AI governance specialist, said ChatGPT is different from other social media platforms like Facebook or X, which are used for posting.

She said it’s also different from Google, which is “a passive index,” simply providing a user with what is already out there on the internet that anybody can see.

“It is not asking follow-up questions. There’s no express encouragement,” Bauer said.

“What [ChatGPT] does is it has a conversation with you, it has a dialogue.”

WATCH | Premier says OpenAI apology not enough:

OpenAI apologizes to Tumbler Ridge, B.C.; premier says it’s necessary but insufficient

OpenAI CEO Sam Altman sent an apology letter to Tumbler Ridge, B.C., after his company failed to alert law enforcement about an account that belonged to the Tumbler Ridge shooter. The shooter killed eight people, including six children.

It’s unclear what exactly ChatGPT said to the shooter, Bauer said. “And whether it said, ‘Sure, you can do this. Have you thought about that? Ask me more questions. Let me know about this, and then I can give you more information about that.'”

That raises another question, Feldman said: even if OpenAI had a duty to warn, did its failure to act actually cause the crime?

“Can you say that ChatGPT caused the crime to happen, or are their queries and responses simply too remote to what happened?” she said.

Likely to argue ChatGPT ‘defective product’

To that end, Feldman said the families are likely to argue that ChatGPT is a defective product without appropriate safeguards.

“The question is, is ChatGPT a defective product or merely a product that was used improperly?” she said.

“And is it analogous to a product at all? It’s not a widget that we’re using or a car where the brakes failed. It’s more of a service.”

WATCH | Mayor discusses shooting investigation with premier:

Tumbler Ridge mayor discusses shooting investigation with Premier Eby

The mayor of Tumbler Ridge is in Victoria this week speaking with the premier and other provincial officials more than two months after the mass shooting that killed eight people.

On that point, Doyle said, the big challenge for the plaintiffs is that they will have to show a reasonable alternative design.

“I just can’t imagine how difficult this is when it comes to a generative AI platform,” he said.

“I would anticipate the challenge that the plaintiffs have in those instances is finding ways to prove an alternative design that has safety features without basically neutering the product.”

© 2023 Today in Canada. All Rights Reserved.