
The widow of a man killed in last year’s mass shooting at Florida State University is suing ChatGPT developer OpenAI, blaming the company’s AI chatbot for providing advice on how to carry out the attack.
The lawsuit comes after federal authorities revealed that ChatGPT provided the shooter with information about the time and location on campus that would maximize the number of victims, as well as the type of gun and ammunition to use. Authorities say he was also told that an attack could draw more media attention if children were involved.
“OpenAI knew this would happen. It had happened before and it was only a matter of time before it happened again,” Vandana Joshi, whose husband Tiru Chabba was one of the two people killed, said in a statement on Monday. Six others were injured in the attack.
The lawsuit, filed Sunday in federal court, says OpenAI should have built ChatGPT with safeguards to notify a person that police may need to be alerted “to prevent a specific plan of harm to the public”.
OpenAI has denied any wrongdoing in what it called a “horrific crime”.
“In this case, ChatGPT provided factual answers to questions, drawing on information widely available from public sources on the internet, and did not encourage or promote illegal or dangerous activities,” Drew Pusateri, a spokesman for the company, said in an email to the Associated Press.