A Wisconsin software engineer was arrested on Monday for allegedly creating and distributing thousands of images of artificial intelligence-generated child sexual abuse material (CSAM).
Court documents describe Steven Anderegg as “extremely tech-savvy” with a background in computer science and “decades of software engineering experience.” Anderegg, 42, is accused of sending artificial intelligence-generated nude images of minors to a 15-year-old boy via Instagram DMs. Anderegg came to the attention of law enforcement after the National Center for Missing and Exploited Children flagged the messages, which he allegedly sent in October 2023.
According to information Instagram provided to law enforcement, Anderegg posted an Instagram Story in 2023 “consisting of actual GenAI images of minors wearing BDSM-themed leather suits” and encouraged others who had missed it to “check out” what he was sharing on Telegram. Anderegg allegedly “discussed his desire to have sex with prepubescent boys” in private messages with other Instagram users and told one user that there were “tons” of other AI-generated CSAM images on his Telegram.
Anderegg allegedly began sending the photos to another Instagram user after learning he was only 15 years old. “When the minor informed him of his age, the defendant did not rebuff him or inquire further. Instead, he wasted no time describing to the minor how he created sexually explicit GenAI images and sent customized content to the child,” the complaint states.
Prosecutors said that when law enforcement searched Anderegg’s computer, they found more than 13,000 images, “hundreds if not thousands of which depicted nude or semi-nude prepubescent minors.” Charging documents say Anderegg produced the images with Stable Diffusion, a text-to-image model created by Stability AI, and used “extremely specific and explicit prompts to create the images.” Anderegg also allegedly used “negative prompts” to avoid generating images depicting adults and used third-party Stable Diffusion add-ons that “specialize in genitalia.”
Last month, several major tech companies, including Google, Meta, OpenAI, Microsoft, and Amazon, said they would review their artificial intelligence training material for CSAM. The companies committed to a new set of principles, including “stress testing” models to ensure they do not create CSAM. Stability AI also signed on to these principles.
According to prosecutors, this is not the first time Anderegg has come to law enforcement’s attention over alleged CSAM possession via a peer-to-peer network. Prosecutors said that in 2020, someone using the internet at Anderegg’s home in Wisconsin attempted to download multiple known CSAM files. When law enforcement searched his home that year, Anderegg admitted to running a peer-to-peer network on his computer and to frequently resetting his modem, but he was not charged.
In a brief supporting Anderegg’s pretrial detention, the government noted that he had worked as a software engineer for more than 20 years and that his résumé included a recent job at a startup where he touted his “excellent technical understanding” in formulating artificial intelligence models.
If convicted, Anderegg faces up to 70 years in prison, though prosecutors said “the recommended sentencing range could be as high as life in prison.”