Florida to open criminal investigation into OpenAI over ChatGPT’s influence on alleged mass shooter

Florida’s attorney general will launch a criminal investigation into OpenAI and its chatbot ChatGPT over their potential influence on users who threaten to harm themselves or others, including whether the tool “provided significant advice” to a gunman accused of carrying out a mass shooting in the state last year.
The state’s attorney general, James Uthmeier, said at a news conference Tuesday that his office was expanding its review of OpenAI, that a “criminal investigation is warranted” and that the state had subpoenaed the $852 billion California-based technology firm.
“If it were a person on the other end of this screen, we would charge them with murder,” Uthmeier said during an event in Tampa.
Uthmeier, who was appointed by Florida governor Ron DeSantis earlier this month, announced an investigation into the artificial intelligence company over potential national security and safety concerns.
But the issuance of a subpoena to OpenAI is a marked escalation that comes after lawyers spoke out on behalf of the family of Robert Morales, one of two people killed in a shooting rampage at Florida State University’s Tallahassee campus last April that also injured six people.
The lawyers said they learned that the gunman was “in constant communication with ChatGPT” before the shooting and that the chatbot “may have given the shooter advice on how to commit these heinous crimes.”
Phoenix Ikner, who was 20 at the time of the shooting, allegedly communicated frequently with ChatGPT before the campus attack, asking for detailed information about how guns and ammunition work, where to find the most students and how the country might respond.
Ikner is expected to go to trial in October on charges of first-degree murder and attempted first-degree murder in the shooting. He has pleaded not guilty.
The lawsuit filed on behalf of the Morales family is among several claims filed against OpenAI and Google alleging that their AI chatbots played a role in encouraging people to take their own or others’ lives.
Uthmeier said at the press conference that a review of the communications revealed that “ChatGPT offered the shooter important advice before he committed such heinous crimes.”
He added that “the chatbot advises the shooter on what type of weapon he should use, which ammunition is used with which weapon, and whether the weapon will be useful at short range.”
“Just because it’s a chatbot in AI doesn’t mean there isn’t criminal liability,” Uthmeier said, adding that his office will “look at who knew what, designed what, or what they should have done.”
OpenAI spokesperson Kate Waters said in a statement to NBC News: “Last year’s mass shooting at Florida State University was a tragedy, but ChatGPT is not responsible for this horrific crime.
“In this case, ChatGPT provided factual answers to questions, drawing on information that is widely available from public sources online, and did not promote or encourage illegal or harmful activities.”
The company said it continues to cooperate with authorities and shared information with law enforcement following the identification of a ChatGPT account believed to be associated with the suspect.
The announcement of the Florida investigation came two days after the worst mass shooting in the United States in two years, on Sunday in Shreveport, Louisiana, where eight children were killed in what authorities described as a violent domestic incident. Shamar Elkins, the father of seven of the children, was identified as the attacker and was shot and killed by police.