The families of the victims of one of the worst mass shootings in Canadian history are taking OpenAI to court in California “to pursue landmark damage awards,” according to firm Rice Parsons Leoni & Elliott.
In a news release from the firm, lawyers say the Tumbler Ridge, B.C., shooter’s ChatGPT account was banned for “disturbing content,” which allegedly included planning violent scenarios, prior to the February tragedy.
“However, despite some 12 different OpenAI employees imploring the company to notify Canadian law enforcement about the Shooter’s plans, nothing else was done,” the firm said.
On Feb. 10, 18-year-old Jesse Van Rootselaar shot and killed her mother and half-brother at home before fatally shooting five children and an educator at the local secondary school, as well as injuring numerous others. She died of a self-inflicted injury.
The firm shared documents Wednesday from seven lawsuits filed by the victims of the school shooting, including the families of the six victims who died there and a seriously injured survivor.
The firm said litigating the cases in Canada would be challenging, and damages for pain and suffering are capped at about $470,000.
Instead, the victims’ families will bring their cases against OpenAI in California, where larger awards are possible.
The firm noted a lawsuit filed in B.C. by the family of Maya Gebala, who was seriously injured in the shooting, has been discontinued.
OpenAI responded to news of the lawsuits in a statement to CBC News by saying it has a “zero-tolerance policy for using our tools to assist in committing violence.”
“As we shared with Canadian officials, we have already strengthened our safeguards, including improving how ChatGPT responds to signs of distress, connecting people with local support and mental health resources, strengthening how we assess and escalate potential threats of violence, and improving detection of repeat policy violators,” the statement said.
OpenAI CEO Sam Altman wrote an apology letter to the community, shared last week, but Gebala’s mother Cia Edmonds said she isn’t accepting it.
In a statement shared by her lawyers, Edmonds questioned whether Altman used ChatGPT to draft the apology.
“It is empty, soulless, and lacks any human warmth. Only a machine could have put those words together and called it an apology,” Edmonds wrote.
She asked why Altman didn’t contact Canadian authorities to advise them of concerns about the shooter.
John Rice, lead Canadian counsel for the victims, said the Tumbler Ridge tragedy was avoidable.
“Based on what we understand the Shooter to have discussed with ChatGPT, this murderous rampage was specific, predictable, and preventable — and OpenAI had the chance to stop it,” he said.
Rice said the victims’ families are seeking justice and have faith in their American neighbours.
“They want OpenAI’s conduct assessed in the same jurisdiction it calls home — the Northern District of California…. Never again should another AI-predicted and facilitated mass shooting occur. Full stop.”
More to come.