
A Florida woman weaponized artificial intelligence to fabricate evidence of a sexual assault, creating a ChatGPT-generated image of a fake attacker and filing a false police report that wasted taxpayer resources and undermines legitimate victims.
Story Snapshot
- Brooke Schinault used ChatGPT to create fake attacker image days before reporting false assault
- St. Petersburg police discovered AI-generated evidence in deleted folder predating alleged crime
- Case represents dangerous new frontier of technology abuse against law enforcement
- Incident highlights urgent need for updated legal frameworks to combat AI evidence fraud
Digital Deception Exposed Through Police Investigation
Brooke Schinault, a 32-year-old St. Petersburg mother, reported on October 7, 2025, that a male stranger entered her home and sexually assaulted her. She provided police with what she claimed was a photograph of her attacker. However, investigators quickly determined the image was artificially generated using ChatGPT’s image creation tools. Police discovered the fabricated evidence in a deleted folder on her device, timestamped several days before the alleged assault occurred.
AI image was used in woman's false sex attack claim, cops say https://t.co/6TxGZkKNIg Photo of supposed male attacker was actually created via ChatGPT pic.twitter.com/qkmZOEetXZ
— The Smoking Gun (@tsgnews) October 16, 2025
Taxpayer Resources Wasted on Fabricated Crime
The false report triggered a full police investigation, diverting precious law enforcement resources from legitimate crimes. St. Petersburg police officers responded to the scene, collected evidence, and launched investigative procedures based on Schinault’s manufactured claims. This misuse of emergency services demonstrates how AI technology can be weaponized to waste taxpayer-funded resources while potentially delaying response to actual emergencies and real victims requiring assistance.
Growing Threat to Justice System Integrity
This case appears to be among the first documented instances of AI-generated imagery being used to support a false sexual assault allegation, setting a troubling precedent. Law enforcement agencies nationwide already struggle with limited tools and legal frameworks for identifying AI-generated content. The incident exposes vulnerabilities in evidence authentication processes and shows how easily individuals can now create convincing fake evidence. This technological abuse threatens the integrity of criminal investigations and judicial proceedings across America.
Legal Consequences and Broader Implications
Schinault was arrested for filing a false police report and released on $1,000 bond after spending one night in jail. Her motive remains unclear in official documents, though social media posts reference past abuse experiences. The case underscores the urgent need for legislative updates addressing AI-generated evidence fraud. Many states lack adequate legal frameworks to prosecute such technological misuse, creating enforcement gaps that criminals can exploit to manipulate the justice system.
False accusations damage the credibility of legitimate sexual assault victims while wasting resources meant to protect citizens. This abuse of AI technology represents a dangerous evolution in evidence fabrication that threatens constitutional principles of justice and due process. Law enforcement must develop new forensic capabilities to identify AI-generated content before more criminals exploit these technological vulnerabilities.
Brooke Schinault told police a man broke into her home, knocked her down & s*xually assaulted her.
But, she was able to take his photo.
Turned out, the photo was AI generated. Her report was a complete hoax.
The AI man then sued her for defamation. (That part is a joke.) pic.twitter.com/7SR14rMCLL
— Truth And Justice (@TruthAndJust1) October 17, 2025
Sources:
Surge in AI-generated child sexual abuse images alarms advocacy groups and investigators
AI-generated child sexual abuse images Maine police cannot investigate
Cops: AI Image Used In False Sex Attack Claim