Lawsuit Filed Against ChatGPT for Failing to Report Planned Crime on Platform

Technology|10/3/2026

  • Parents of attack victim file lawsuit against OpenAI, the developer of ChatGPT
  • Perpetrator allegedly used the platform to plan an assault that left a girl critically injured

The parents of a teenage girl critically injured in a school shooting in Canada have filed a civil lawsuit claiming that OpenAI, the developer of ChatGPT, knew the perpetrator was using the platform to plan the attack.

According to the lawsuit, Jessie Van Roostsilar carried out the shooting at Tumbler Ridge in British Columbia on February 10, killing eight people before taking her own life. The attack left the victim, Maya Gibbala, with three close-range gunshot wounds to her head, neck, and cheek, causing catastrophic brain injuries and permanent physical and cognitive impairments.

OpenAI had previously considered notifying authorities about the perpetrator’s behavior but never did. The company told police that her ChatGPT account had been deactivated, but the lawsuit alleges she simply created another account to bypass the restriction.

The complaint, filed in the Supreme Court of British Columbia, states that OpenAI “had specific knowledge that the perpetrator was using ChatGPT to plan a mass-casualty event like the Tumbler Ridge shooting.”

It adds that the perpetrator relied on the AI chatbot as a trusted source and “partner in planning,” which allegedly enabled her to organize the deadly attack.