Ohio Man Convicted in First Case Under New Federal 'Take It Down Act'
A 37-year-old Ohio man has been convicted under a new federal law for creating and distributing abusive images, including those generated by artificial intelligence. The case marks the first conviction under the 2025 Take It Down Act, which prohibits the publication of nonconsensual intimate digital content.
Case Details and Conviction
James Strahler, 37, pleaded guilty to charges of cyberstalking, producing obscene visual representations of child sexual abuse, and publication of digital forgeries. According to the U.S. Attorney’s Office for the Southern District of Ohio, the crimes involved both real and AI-generated images.
Court documents state that Strahler used dozens of AI platforms and over 100 web-based models to create more than 700 illicit images. These images were posted to a website dedicated to child sexual abuse material.
Strahler was identified after an adult victim reported receiving threatening messages. A subsequent investigation, including evidence recovered from his seized phone, revealed additional victims.
Legal Context: The Take It Down Act
Enacted in 2025, the Take It Down Act criminalizes the publication of nonconsensual intimate imagery, a category that extends to AI-generated digital forgeries such as those at issue in this case. Strahler's conviction is the first secured under the statute.
Expert Perspectives on Enforcement Challenges
Experts cited in the case filings and related reporting highlight significant challenges for law enforcement and legal systems in addressing such crimes.
- Volume and Accessibility of Content: Kolina Koltai, a senior researcher at Bellingcat, stated that the volume of content created by offenders like Strahler is not unusual. Koltai noted that current AI tools require minimal technical knowledge compared to earlier editing software, making content creation more accessible and contributing to an overwhelming amount of material for investigators.
- Proliferation of Platforms: Koltai also stated that numerous platforms exist for creating deepfake material. These sites often operate through multiple domain extensions to avoid being taken offline, creating a persistent challenge where shutting down one site can lead to others appearing.
- Pace of Technological Change: Matthew Faranda-Diedrich, an attorney who handles deepfake cases, stated that AI technology has moved from obscure to mainstream faster than laws have adapted. He reported that such cases have grown from virtually none two years ago to five or six active matters on his docket at any given time.
Research on Victim Demographics and Scope
Available research and expert testimony provide context on the scope of these crimes:
- Research indicates the distribution of nonconsensual deepfakes is particularly prevalent among young people.
- Women and girls represent approximately 90% of victims in these crimes, according to available data.
- Matthew Faranda-Diedrich stated that in most of the cases he has handled, both the victims and the perpetrators have been children aged 14 to 16. He emphasized that schools have a responsibility to involve law enforcement when such technology is used to create abusive material.