Federal prosecutors have charged a U.S. Army soldier with using artificial intelligence to generate sexually explicit images of children he knew, part of a broader government effort to crack down on people who use AI tools to create child sexual abuse material.
Seth Herrera, 34, a soldier stationed in Anchorage, possessed thousands of images depicting the violent sexual abuse of children and used AI tools to produce realistic child sexual abuse material, the Justice Department said in a statement Monday. He was arrested last week and made his first court appearance Tuesday.
Court documents say Herrera took photos of children he knew and ran them through AI software that digitally removed their clothing or superimposed them onto pornographic images depicting oral sex or penetration with a sexual object. He stored and received child sexual abuse imagery through popular messaging apps such as Telegram.
Herrera's arrest comes as AI-generated child sexual abuse material (CSAM), also known as child pornography, proliferates online, enabled by software that fabricates images. Child safety experts told The Washington Post that such tools are increasingly promoted on pedophile forums as a way to create highly realistic sexual images of children without editing real photographs.
Meanwhile, federal prosecutors are building cases arguing that AI-generated images should be treated the same as real recordings of child sexual abuse.
“The misuse of cutting-edge generative AI is accelerating the spread of dangerous content,” Deputy Attorney General Lisa Monaco said in the Justice Department's statement, adding that “…criminals considering the use of AI to perpetuate their crimes should stop and think twice.”
The Defense Department referred questions to the Army, which did not immediately respond to a request for comment. Assistant federal public defender Benjamin Muse, Herrera's attorney, declined to comment.
Herrera's arrest follows a string of recent federal cases involving AI and child abuse material. In May, a Wisconsin man was charged with using AI to create images of child sexual abuse, in what was likely the first federal charge for producing such material generated entirely by AI.
In two other recent cases, federal officials said men in North Carolina and Pennsylvania had used AI either to digitally strip clothing from real photos of children or to graft their faces onto explicit sex scenes, a technique known as a “deepfake.”
Homeland Security Investigations executed a search warrant and seized three Samsung Galaxy phones belonging to Herrera, which contained “tens of thousands” of videos and images dating back to March 2021 that showed children as young as infants being violently raped, according to court documents.
Herrera shared explicit content on more than just Telegram, court papers say, also using the chat apps Potato Chat, Enigma, and Nandbox. He also created his own public Telegram group to store explicit material.
Court papers say Herrera “morphed” images into sexual abuse material by taking photos and videos of children he knew in private moments, such as while they showered. Authorities said he would zoom in on those images and use AI to “enhance” them. When the images “did not satisfy his sexual desire,” officials said, Herrera turned to AI to generate depictions of minors engaged in “the type of sexual conduct he wanted to see.”
Robert Hammer, special agent in charge of Homeland Security Investigations' Pacific Northwest Division, said Herrera's use of AI to create child sexual abuse images while serving as a soldier was a “deep breach of trust” and underscored the challenges law enforcement will face in protecting children.
Herrera is an enlisted Army specialist who served as a motor transport operator in the 11th Airborne Division at Joint Base Elmendorf-Richardson in Anchorage, according to Stars and Stripes.
Herrera faces one count each of transporting, receiving, and possessing child sexual abuse images. If convicted, he could face up to 20 years in prison.