The ChatGPT Warning He Ignored: Florida AI Accountability Test
When Hisham Abugharbieh asked ChatGPT how to dispose of a body in a black garbage bag, the AI system warned him it sounded dangerous. He replied: "How would they find out?" That exchange, documented in a charging affidavit and confirmed by the Florida attorney general's office, is now at the center of a criminal investigation. On Monday, Florida Attorney General James Uthmeier expanded that investigation to include the murders of two University of South Florida doctoral students, adding them to a prior case against a separate suspect accused of a mass shooting at Florida State University.
The victims — Zamil Limon, 27, and Nahida Bristy, 27, both doctoral students from Bangladesh on student visas — were found dead on April 24 on the Howard Frankland Bridge spanning Tampa Bay, inside multiple black trash bags in advanced stages of decomposition. Charging documents say Abugharbieh, their roommate, bought duct tape from Amazon on April 7 and purchased fire starter, charcoal, trash bags and lighter fuel on April 11, before the students disappeared. He is charged with two counts of first-degree murder, unlawfully moving a dead body, failure to report a death, tampering with evidence, false imprisonment and battery.
According to charging documents reported by ABC News, Abugharbieh asked ChatGPT a series of operational questions in the days before the students vanished: what happens to a body placed in a black garbage bag; whether a vehicle identification number (VIN) can be changed; and whether cars are checked at Hillsborough River State Park. In the separate FSU case, Phoenix Ikner allegedly asked ChatGPT what ammunition to use, whether a gun would be useful at short range, and when the student union is busiest.
Uthmeier's office confirmed it had expanded its prior criminal investigation into OpenAI — opened April 21 over the FSU shooting — to include the USF case. "We are expanding our criminal investigation into OpenAI to include the USF murders after learning the primary suspect used ChatGPT," Uthmeier said in a statement. "If ChatGPT were a person, it would be facing charges for murder."
OpenAI said in a statement: "This is a terrible crime, and our thoughts are with everyone affected. We are looking into these reports and will do whatever we can to support law enforcement."
The legal theory Uthmeier is advancing has no clear precedent in U.S. courts. No American software company has been held criminally liable for how a user employed its product — a protection that has survived decades of case law even as computers became essential to countless crimes. Whether that protection extends to a generative AI system that warned its user before the crime is the open question Florida courts will now have to answer. The Florida attorney general's office has not disclosed what specific evidence, beyond the ChatGPT logs, it has gathered to support the criminal theory.
What to watch next: whether prosecutors can produce evidence that ChatGPT's responses in the USF case went beyond what a basic search engine would have returned. That distinction — advice versus retrieval — may determine whether the investigation becomes a case or remains a press release.