Alaska Man Reported Someone for AI CSAM, Then Got Arrested for the Same Thing
By Thomas Maxwell | Published December 27, 2024

Photo: Anadolu / Getty

If you are going to contact the police to rat someone out for expressing an interest in child sexual abuse material (CSAM), it is probably not the best idea to have the same material on your own devices, or to consent to a search so law enforcement can gather more information. But that is allegedly what one Alaska man did, and it landed him in police custody.

404 Media reported earlier this week on the man, Anthaney O'Connor, who ended up getting himself arrested after a police search of his devices allegedly revealed AI-generated child sexual abuse material (CSAM). From 404:

According to newly filed charging documents, Anthaney O'Connor reached out to law enforcement in August to alert them to an unidentified airman who shared child sexual abuse material (CSAM) with O'Connor. While investigating the crime, and with O'Connor's consent, federal authorities searched his phone for additional information. A review of the electronics revealed that O'Connor allegedly offered to make virtual reality CSAM for the airman, according to the criminal complaint.

According to police, the unidentified airman shared with O'Connor an image he took of a child in a grocery store, and the two discussed how they could superimpose the minor into an explicit virtual reality world.

Law enforcement claims to have found at least six explicit, AI-generated CSAM images on O'Connor's devices, which he said had been intentionally downloaded, along with several real ones that had been unintentionally mixed in. Through a search of O'Connor's home, law enforcement uncovered a computer along with multiple hard drives hidden in a vent; a review of the computer allegedly revealed a 41-second video of child rape.

In an interview with authorities, O'Connor said he regularly reported CSAM to internet service providers but was still sexually gratified by the images and videos. It is unclear why he decided to report the airman to law enforcement. Maybe he had a guilty conscience, or maybe he truly believed his AI-generated CSAM did not break the law.

AI image generators are typically trained on real photos, meaning pictures of children generated by AI are fundamentally based on real images; there is no way to fully separate the two. In that sense, AI-based CSAM is not a victimless crime. The first such arrest of someone for possessing AI-generated CSAM came just back in May, when the FBI arrested a man for using Stable Diffusion to create thousands of realistic images of prepubescent minors.

Proponents of AI will say it has always been possible to create explicit images of minors using Photoshop, but AI tools make it exponentially easier for anyone to do so. A recent report found that one in six congresswomen have been targeted by AI-generated deepfake porn. Many products include guardrails to prevent the worst uses, similar to the way printers refuse to photocopy currency. Implementing hurdles at least prevents some of this behavior.