Families of Canada school shooting victims sue OpenAI over shooter's use of ChatGPT
The families of victims of a school shooting in a Canadian Rockies town are suing artificial intelligence company OpenAI in U.S. federal court, seeking to hold the ChatGPT maker responsible for failing to alert police to the shooter’s alarming interactions with the chatbot.
A lawsuit filed Wednesday on behalf of 12-year-old Maya Gebala, who was critically injured in the February shooting, is among the first of dozens of cases that families in Tumbler Ridge, British Columbia, are planning, with claims alleging wrongful death, negligence and product liability.
Plaintiffs' attorney Jay Edelson said in an interview that decisions made by OpenAI and its CEO Sam Altman “have destroyed the town. The people are really resilient, but what happened is unimaginable.”
Altman sent a letter last week formally apologizing to the community that his company did not notify law enforcement about the shooter's online behavior.
Authorities have said the shooter killed her mother and 11-year-old stepbrother in their home on Feb. 10 before opening fire at the nearby Tumbler Ridge Secondary School, killing five children and an educator before killing herself. Twenty-five people were also injured in the attack, Canada's deadliest mass shooting in years.
The case highlights concerns about the harms posed by overly agreeable AI chatbots and what obligations the tech industry has to control them or notify authorities about planned violence by chatbot users. This month, prosecutors investigating the deaths of two University of South Florida doctoral students said that the suspect asked ChatGPT about body disposal in the lead-up to the students’ disappearance.
In response to the lawsuit, OpenAI said in a written statement that the “events in Tumbler Ridge are a tragedy. We have a zero-tolerance policy for using our tools to assist in committing violence."
"As we shared with Canadian officials, we have already strengthened our safeguards, including improving how ChatGPT responds to signs of distress, connecting people with local support and mental health resources, strengthening how we assess and escalate potential threats of violence, and improving detection of repeat policy violators,” the company said.
Edelson, a Chicago-based lawyer known for taking on the tech industry, is already juggling a number of high-profile cases against OpenAI, including from the family of a California teenager who killed himself after conversations with ChatGPT and another from the heirs of an 83-year-old Connecticut woman killed by her son after ChatGPT allegedly amplified the man's “paranoid delusions."
“This is not a passive technology,” said Edelson, comparing the chatbot interactions with a more conventional online search for information. “What we’ve seen in the past is that (for) people who are mentally ill, the chatbot will validate what they’re saying and then amplify what they’re saying.”
Last week, Edelson visited the small town of Tumbler Ridge and met with dozens of people in the basement of a visitor center. He also visited Gebala at a children's hospital in Vancouver, where she remains hospitalized; she seemed alert but was unable to speak.
“It was so heartbreaking,” he said.
The lawsuits filed Wednesday represent the families of the five slain children targeted in the school shooting: Zoey Benoit, Abel Mwansa Jr., Ticaria “Tiki” Lampert and Kylie Smith, all 12, and Ezekiel Schofield, 13, and the education assistant, Shannda Aviugana-Durand.
After the shootings, OpenAI came forward to say that last June the company had flagged that the shooter's account had been used to discuss violence against other people.
The company said it considered whether to refer the account to the Royal Canadian Mounted Police but determined at the time that the account activity didn’t meet a threshold for referral to law enforcement. OpenAI banned the account in June for violating its usage policy.
The lawsuits filed Wednesday allege “the victims didn’t learn this because OpenAI was forthcoming, but because its own employees leaked it to The Wall Street Journal after they could no longer stomach the company’s silence.”
In his letter posted Friday, Altman said he was “deeply sorry that we did not alert law enforcement to the account that was banned in June.”
“While I know words can never be enough, I believe an apology is necessary to recognize the harm and irreversible loss your community has suffered,” Altman wrote.
British Columbia Premier David Eby, in a social media post, called the apology “necessary, and yet grossly insufficient for the devastation done to the families of Tumbler Ridge.”
The Gebala lawsuit accuses OpenAI of negligence involving a failure to warn law enforcement and “aiding and abetting a mass shooting.”
Along with damages, the Gebala lawsuit seeks a court order that would require OpenAI to ban users from ChatGPT if their accounts were deactivated for violent misuse, and to require the company to alert law enforcement when its systems identify someone who poses a “real-world risk of violence.”
An earlier case was filed in a court in British Columbia, but a team of lawyers in both countries is seeking to bring the affiliated cases to San Francisco, where OpenAI is headquartered.
———
AP journalist Jim Morris contributed to this story from Vancouver, British Columbia.
© Copyright The Associated Press. All rights reserved. The information contained in this news report may not be published, broadcast or otherwise distributed without the prior written authority of The Associated Press.
