A man has been accused of using a GoPro camera to film children at Disney World with the intent of generating large numbers of artificial intelligence-generated child abuse images for distribution on the dark web.
Justin Ryan Culmo reportedly confessed to FBI agents that he secretly recorded underage victims at the Orlando, Fla., theme park and at least one middle school in recent years, Forbes reported.
Authorities allege Culmo used the AI model Stable Diffusion to transform the footage into lifelike images depicting child abuse, which he then distributed online under the screen names “Avalanche” and “TheRealAvalanche.”
Investigators located him after identifying one of his alleged victims and tracing the manipulated images back to him, federal authorities said.
When he was arrested, authorities found a large collection of child abuse images on his devices, along with five hidden spy cameras stored in a locked desk drawer, according to the report.
Culmo, who was apprehended in recent months, has been charged with a range of child exploitation offenses, including abusing his two daughters, secretly recording minors, and distributing child sexual abuse material online, Forbes reported.
Culmo, who had been on federal authorities’ radar since at least 2012, has not been charged with producing AI-generated child sexual abuse material.
He has pleaded not guilty to the charges against him and is scheduled to stand trial next month.
Jim Cole, a former Department of Homeland Security agent who worked on the case, emphasized the alarming scale of exploitation AI can enable in the wrong hands.

“This case starkly highlights the ruthless exploitation that AI can enable when wielded by someone with the intent to harm,” Cole said.