
Family of Florida university shooting victim sues over suspect’s ChatGPT use

The family of one of the two men killed in the April 2025 Florida State University (FSU) shooting has filed a federal lawsuit against ChatGPT creator OpenAI, alleging that the suspected gunman carried out the attack “with input and information provided to him during conversations with ChatGPT over the months and specifically in the days leading up to the attack.”

The case, first reported by NBC News, was filed Sunday in Florida’s northern federal district court by Tiru Chabba’s widow, Vandana Joshi. Chabba was killed along with university dining director Robert Morales in a mass shooting that also injured five people on April 17, 2025.

In the 76-page complaint, attorneys allege that then-FSU student Phoenix Ikner, who is accused of carrying out the attack, had “extensive conversations” with ChatGPT beforehand that “would lead any thinking person to conclude that he was devising an imminent plan to harm others.”

“However,” the complaint states, “ChatGPT either culpably failed to connect the dots or was never properly designed to recognize the threat.”

The lawsuit alleges that Ikner used the AI platform to identify guns and ammunition, and that ChatGPT explained how to use the weapons, including telling Ikner that “the Glock has no safety, that it needs to be fired ‘rapidly’ for use under stress,” and allegedly advising him to “keep his finger off the trigger until he’s ready to shoot.”

The plaintiffs allege that ChatGPT “fueled and encouraged Ikner’s delusions; supported his view that he was a sane and rational individual; helped persuade him that violent actions might be necessary to bring about change; assisted him in planning details such as which weapons to use and how to use them; and generally provided what he saw as encouragement in his daydream that he should carry out a massacre, down to the detail of the best time to encounter the most traffic on campus.”

The complaint alleges that OpenAI “should have been aware that the combination of Ikner’s inputs into the product would have resulted in mass casualties and serious harm to the public”, including the plaintiff.

The complaint states that Ikner used ChatGPT for months before the shooting and “engaged in lengthy discussions” on topics ranging from dating and homework to exercise routines. Among those exchanges, “Ikner and ChatGPT engaged in conversations regarding recurring themes such as terrorism and mass shootings, particularly those occurring in schools,” the lawsuit alleges.

At one point, Ikner allegedly asked the chatbot “how many people would need to die for a mass shooting at a school to attract the most attention and make national news,” according to the filing. ChatGPT allegedly responded that attacks that kill “3 or more people” are more likely to attract “national attention from mainstream media” and that “incidents involving children, even 2-3 victims, may attract more attention.”

The lawsuit also alleges that on the day of the shooting, Ikner asked ChatGPT what would happen to the shooter. “ChatGPT explained the legal process, sentence, and outlook for incarceration,” the lawsuit said.

In a statement to the Guardian, a spokesperson for OpenAI disputed the lawsuit’s claims that the chatbot was responsible for the shooting.

The spokesperson said the attack at FSU “was a tragedy, but ChatGPT is not responsible for this terrible crime. After learning of the incident, we identified an account believed to be linked to the suspect and proactively shared this information with law enforcement.

“We continue to cooperate with authorities. In this case, ChatGPT provided factual answers to questions with information that is widely available on public resources on the internet and did not promote or encourage illegal or harmful activities.”

The OpenAI spokesperson’s statement continued: “ChatGPT is a general-purpose tool used for legitimate purposes by hundreds of millions of people every day. We are constantly working to strengthen our protections to detect malicious intent, limit abuse, and respond appropriately when security risks arise.”

The new lawsuit comes about a month after lawyers for Morales’ family said they planned to file their own lawsuit against ChatGPT and OpenAI.

Meanwhile, Florida attorney general James Uthmeier announced on April 21 that he had launched a criminal investigation into OpenAI after reviewing Ikner’s chat logs linked to the FSU attack, stating: “If ChatGPT were a person, he would be charged with murder.”

Ikner is tentatively scheduled to stand trial in October on charges of first-degree murder and attempted first-degree murder. He has pleaded not guilty.
