DOJ makes first arrest over AI-generated CSAM

The U.S. Department of Justice arrested a Wisconsin man last week on charges of producing and distributing artificial intelligence-generated child sexual abuse material (CSAM). It appears to be the first case of its kind, as the Department of Justice hopes to establish a legal precedent that exploitative material remains illegal even if no children were used to create it. “Simply put, AI-generated CSAM is still CSAM,” Deputy Attorney General Lisa Monaco wrote in a press release.

The DOJ says Steven Anderegg, a 42-year-old software engineer from Holmen, Wisconsin, used a fork of the open-source AI image generator Stable Diffusion to produce the images, then used them in an attempt to lure an underage boy into sexual situations. The latter is likely to play a central role in the eventual trial on four counts of “producing, distributing and possessing obscene visual depictions of minors engaging in sexually explicit conduct and transmitting obscene material to a minor under 16 years of age.”

The government said Anderegg’s images showed “naked or semi-nude minors lewdly displaying or touching their genitals or engaging in sexual intercourse with men.” The DOJ claims he used specific prompts, including negative prompts (additional guidance for the AI model telling it what not to produce), to spur the generator into creating the CSAM.

Cloud-based image generators such as Midjourney and DALL-E 3 have safeguards against this type of activity, but Ars Technica reports that Anderegg allegedly used Stable Diffusion 1.5, a variant with fewer guardrails. Stability AI told the publication that the fork was produced by Runway ML.

According to the Justice Department, Anderegg communicated with the 15-year-old boy online and described how he used AI models to create the images. The agency says the defendant sent the teen direct messages on Instagram, including several AI-generated images of “minors lewdly displaying their genitals.” To its credit, Instagram reported the images to the National Center for Missing and Exploited Children (NCMEC), which alerted law enforcement.

If convicted on all four counts, Anderegg could face 5 to 70 years in prison. He is currently in federal custody pending a hearing scheduled for May 22.

The case will test the notion, which some may hold, that CSAM is illegal only because of the exploitation of children in its creation. Although AI-generated digital CSAM does not involve any living people (other than the person entering the prompts), it can still normalize and encourage the material, or be used to lure children into predatory situations. That appears to be something the federal government wants to make clear as the technology rapidly advances and grows in popularity.

“Technology may change, but our commitment to protecting children does not,” Deputy Attorney General Monaco wrote. “The Justice Department will aggressively pursue those who produce and distribute child sexual abuse material (CSAM), no matter how that material is created. Simply put, AI-generated CSAM is still CSAM, and we will hold accountable those who use AI to create obscene, abusive and increasingly realistic images of children.”
