People are using artificial intelligence chatbots for all kinds of tasks these days, but using them to plan a murder is something few would support.
Accordingly, Florida Attorney General James Uthmeier should be commended for investigating OpenAI over its chatbot’s role in two Florida murder sprees involving college students.
Hisham Abugharbieh, 26, is accused in the killings of two doctoral students at the University of South Florida, in Tampa. One of the victims was Abugharbieh’s roommate in an off-campus apartment complex near USF. The other was her girlfriend.
The two women were from Bangladesh. The media has not reported where the suspect is from, although it has noted that he has a criminal record and a history of mental problems.
News organizations cited records from a 2023 domestic-violence-related incident involving Abugharbieh and family members, in which relatives described increasingly erratic behavior and alleged delusional statements. He was reportedly detained under Florida’s Baker Act for an emergency mental-health evaluation after that incident.
In addition, investigators are looking at AI’s role in murders committed by Phoenix Ikner during a three-minute rampage in 2025.
Both murder suspects used ChatGPT shortly before the killings, asking questions obviously related to the subsequent crimes.
The AI queries Abugharbieh made included:
- “How can I dispose of a dead body?”
- questions about putting a body in trash bags and dumpsters
- questions about firearm licensing
- “What happens if a human is put in a black garbage bag and thrown in a dumpster?”
According to court filings, ChatGPT replied that it “sounds dangerous,” after which prosecutors say he followed up with:
“How would they find out?”
OpenAI’s ChatGPT also provided advice to Ikner, who killed two people and shot five others last year at Florida State University.
Prosecutors have conducted an initial review of chat logs between Ikner and ChatGPT to determine whether the AI app aided, abetted or advised the commission of a crime.
Prosecutors think the chatbot advised Ikner on what type of gun and ammunition to use, whether a gun would be useful at short range, and what time of day and at which location would allow for the most potential victims, Uthmeier said.
The court document also reveals many ChatGPT messages linked to the shooter’s motive. On the day of the shooting, Ikner asked the chatbot about possible media responses to a fictitious shooting scenario: whether three casualties would receive enough media attention, when the previous school shooting had occurred, and how the nation would react to a mass shooting at FSU. He also asked what type of firearm and ammunition to use and what time to enter campus, according to a review of his chat logs.
OpenAI has responded that the chatbot merely collected and provided the men with information widely available on the internet.
That may be true, but the central question likely will be whether OpenAI had a duty to program its chatbots to report queries suggesting that someone is contemplating a crime. Such a requirement would probably draw pushback from civil libertarians.
It is another can of worms caused by the burgeoning use of AI technology in our everyday lives.