
Topics: ChatGPT, Artificial intelligence, Crime, US News, Technology

In April 2025, Tiru Chabba was named as one of the two victims of a mass shooting at Florida State University in Tallahassee.
This week, his widow, Vandana Joshi, filed a lawsuit against OpenAI, claiming an artificially intelligent chatbot had advised his killer beforehand on how to wreak maximum devastation.
Phoenix Ikner gunned down both Chabba and university dining director Robert Morales, wounding six others.
Weeks earlier, in conversations that were later made public, a ChatGPT bot had told him which time of day would allow for the largest victim count, as well as which types of gun and ammunition were best to use.

According to NBC, during one discussion, the bot advised the shooter, who'd been an FSU student at the time, that 'if children are involved, even 2-3 victims can draw more attention'.
Ikner had reportedly shared images of firearms he'd acquired, after which the chatbot told him the Glock had no safety and was 'quick to use under stress', advising him to keep his finger off the trigger until he was ready to shoot.
The killer had also inquired about 'the legal process, sentencing, and incarceration outlook'.
Joshi is now suing ChatGPT's parent company, OpenAI, claiming the negligence of tech leaders had contributed to the tragedy.

"OpenAI knew this would happen," she said in a suit filed in federal court on Sunday (10 May). "It’s happened before, and it was only a matter of time before it happened again."
She also accused the AI firm of putting 'their profits over our safety', adding that it 'killed my husband'.
"They need to be responsible before another family has to go through this," the widow continued.
One of Joshi's legal representatives, Bakari Sellers, added: "The unique thing about this is we are not going to allow the American public to have a clinic run on them by OpenAI and ChatGPT."
Another attorney alleged: "ChatGPT inflamed and encouraged Ikner’s delusions; endorsed his view that he was a sane and rational individual; helped convince him that violent acts can be required to bring about change."

A spokesperson for OpenAI, however, has denied wrongdoing.
Drew Pusateri described the double murder as a 'terrible crime' but insisted: "In this case, ChatGPT provided factual responses to questions with information that could be found broadly across public sources on the internet, and it did not encourage or promote illegal or harmful activity.
"ChatGPT is a general-purpose tool used by hundreds of millions of people every day for legitimate purposes. We work continuously to strengthen our safeguards to detect harmful intent, limit misuse, and respond appropriately when safety risks arise.”
His comments come a month after Florida's attorney general announced a rare investigation into ChatGPT's alleged involvement in Ikner's crime.
James Uthmeier said in April: "If it was a person on the other end of that screen, we would be charging them with murder."
Ikner, 21, the son of a sheriff's deputy, pleaded not guilty.

He faces two counts of first-degree murder, as well as several counts of attempted first-degree murder.
Prosecutors are seeking the death penalty should he be convicted in the coming weeks.
Tyla contacted OpenAI for further comment.