Original article by Leyland Cecco in Toronto
The family of a child critically injured in one of Canada’s worst mass shootings is suing OpenAI, arguing the technology company could have prevented the attack on a school last month.
The lawsuit comes days after the head of OpenAI said he would apologize to the families of a remote Canadian town after violence shattered the tight-knit community.
Eight people – including five school students, aged 12 to 13, and a 39-year-old teaching assistant – were killed by an 18-year-old shooter in the mountain town of Tumbler Ridge on 10 February.
It later emerged that the shooter, Jesse Van Rootselaar, who died of a self-inflicted injury, had described violent scenarios involving guns to ChatGPT over several days in June, which an automated review system flagged, according to the Wall Street Journal.
But OpenAI, which owns the chatbot, said the account activity did not, in its view, indicate “credible or imminent planning”; it banned Van Rootselaar’s account but did not notify authorities in Canada. The company later said it found a second account linked to the shooter after the first was suspended.
On Monday, Cia Edmonds filed a lawsuit against the company on behalf of herself and her two daughters, Maya and Dahlia Gebala, both of whom were present during the shooting.
“The purpose of this lawsuit is to learn the whole truth about how and why the Tumbler Ridge mass shooting happened, to impose accountability, to seek redress for harms and losses, and to help prevent another mass-shooting atrocity in Canada,” the law firm Rice Parsons Leoni & Elliott LLP, which is representing the family, said in a statement.
The allegations have not been tested in court.
Maya, 12, was shot three times. One bullet entered her head above her left eye and another hit her neck. A third bullet grazed her cheek and part of her ear, the lawsuit says.
She remains in hospital after suffering a catastrophic traumatic brain injury, permanent cognitive and physical disability, right-sided hemiplegia, scarring and physical deformities, according to the claim.
Both Edmonds and her daughter Dahlia, who was not injured physically in the shooting, have experienced PTSD, anxiety, depression and sleep disturbances.
Edmonds’ civil claim alleges ChatGPT was rushed to market by OpenAI without adequate safety studies. The family is seeking undisclosed punitive damages, saying the company’s conduct “is reprehensible and morally repugnant” to both the plaintiffs and the “community at large”.
Last week, OpenAI’s CEO, Sam Altman, met virtually with the British Columbia premier, David Eby, and Darryl Krakowka, the mayor of Tumbler Ridge, amid mounting frustration that the tech giant’s existing policies did not require it to report violent content to police.
“Everybody on the call recognized that an apology is nowhere near sufficient, but also that it is completely necessary,” Eby said. “And the mayor of Tumbler Ridge is going to work with OpenAI to make sure that any public statements relating to that are done in the way that is appropriate and meaningful, as much as possible, [and] doesn’t retraumatize people in the community.”
Asked to comment on the lawsuit, a spokesperson for the company called the shooting an “unspeakable tragedy” and said Altman will work with Eby and Krakowka “to find the best way to convey his apology and support to the Tumbler Ridge community” but did not give a timeline.
“OpenAI remains committed to working with provincial and local officials to make meaningful changes that help prevent tragedies like this in the future.”
The company did not say if the lawsuit would change Altman’s plans to apologize.
“OpenAI had the opportunity to notify authorities and potentially even to stop this tragedy from happening,” Eby told reporters after the meeting with Altman. The premier said that while the company could have done more, he also pointed to a lack of mental health support and the shooter’s access to firearms.
Eby, who delivered an emotional speech to the community at a vigil in the days after the shooting, has emerged as a staunch critic of the largely nonexistent regulatory framework governing how artificial intelligence companies operate in Canada – and how OpenAI handled the situation.
“It’s not acceptable that it’s up to the companies about whether or not to report, and that needs to change.”
Eby had refused meetings with members of the company’s leadership team, demanding instead that he speak directly with Altman. In the 30-minute call, the premier said he did not ask about interactions between the shooter and OpenAI’s chatbot.
Already under pressure from lawmakers, the company has changed how it works to better identify potential warning signals of serious violence. Canada’s AI minister, Evan Solomon, said he had asked the company to apply the new safety standards retroactively and review previously flagged cases.
“This will determine whether additional incidents that would have been referred to law enforcement under OpenAI’s new safety standards were missed, and ensure they are promptly reported to the RCMP,” Solomon said.
While Eby said OpenAI’s leadership has been “responsive” to the concerns of governments, he warned that other companies with similar chatbots hadn’t yet changed their policies.
“The status quo doesn’t work, didn’t work, and it very much presents the threat that it might fail again,” said Eby. “And so change needs to be made quite urgently.”