April 9, 2026

Take It Down Act's first conviction: James Strahler

Sources: NBC News, 10tv.com, multistate.ai, dlawgroup.com, The 19th, and 20+ others

First conviction under the federal law criminalizing AI-generated sexual abuse imagery

The Take It Down Act became federal law on May 19, 2025, making it illegal to knowingly publish intimate visual depictions of people without their consent. Senators Ted Cruz (R-TX) and Amy Klobuchar (D-MN) sponsored the bipartisan bill after a 2023 incident where Aledo, Texas high school students had nude deepfakes created and shared without permission. The Senate passed the bill unanimously in February 2025, and the House passed it 409-2 in April 2025. The law imposes criminal penalties of up to two years in prison for offenses involving adults, and up to three years for offenses involving minors. Platforms must remove reported intimate imagery within 48 hours or face FTC enforcement action.

James Strahler II, 37, of Columbus, Ohio pleaded guilty on April 7, 2026 in U.S. District Court before Chief Judge Sarah D. Morrison, making him the first person convicted under the Take It Down Act. Strahler admitted to three federal crimes: cyberstalking, producing obscene visual representations of child sexual abuse material, and publication of digital forgeries. Between December 2024 and June 2025, Strahler conducted a coordinated campaign targeting six women and multiple children. The U.S. Attorney's Office for the Southern District of Ohio, led by Dominick S. Gerace II, prosecuted the case. Strahler was arrested in June 2025 and awaits sentencing.

Strahler installed at least 24 AI apps on his phone and used more than 100 web-based AI models to generate explicit content. Investigators recovered 2,400 images and videos from his devices. He created more than 700 images in total, many of which he posted to a website dedicated to child sexual abuse material. The speed of AI image generation allowed him to produce content quickly and in volume.

Strahler targeted his victims methodically using telephone calls, voicemails, text messages, and web postings. He attacked at least six adult women by creating and distributing nude deepfakes to their coworkers, family members, and online. In one case, he created an AI-generated video depicting an adult victim engaged in explicit sexual acts, then circulated the video to damage her reputation. Against child victims, Strahler used AI to place the faces of local boys onto explicit sexual content.

The federal definition of digital forgeries in the Take It Down Act includes any image or video created, adapted, or manipulated by artificial intelligence. AI-generated sexual images can be created from text descriptions alone. The law doesn't prohibit creating deepfakes generally, which would raise free speech concerns. It criminalizes the publication of intimate depictions without consent.

The Take It Down Act shifted how federal law treats online platforms. Under Section 230 of the Communications Decency Act (1996), platforms historically received broad immunity from liability for user-generated content. The Take It Down Act carves out an exception requiring platforms to actively comply with takedown requests for intimate imagery within 48 hours. The FTC enforces the Act's requirements, treating platform non-compliance as an unfair or deceptive practice.

Strahler's arrest followed an FBI investigation conducted with Columbus police. Investigators had to establish that images were AI-generated rather than real, trace Strahler's use of multiple AI platforms, and prove he distributed the content knowingly. Digital forensics teams analyzed his devices to recover deleted images and communications. The case set a precedent for how federal prosecutors can build cases involving AI-generated content.

Melania Trump publicly celebrated Strahler's conviction as a major victory for the Take It Down Act and her Be Best initiative. Her advocacy beginning in 2024 helped build support across party lines. The bipartisan nature of the law meant the issue transcended typical partisan divisions.

The Take It Down Act is one of the first major federal laws to regulate artificial intelligence directly. The unanimous Senate passage in February 2025 signaled congressional determination to act on AI harms. However, the law takes a narrow approach targeting a specific harm rather than regulating AI development broadly. Critics argue this reactive approach treats symptoms rather than causes.

The Strahler conviction coincides with broader deepfake regulation momentum. In January 2026, the Senate passed the DEFIANCE Act by unanimous consent, allowing victims to sue for a minimum of $150,000, or $250,000 if the deepfake involves assault or stalking. At the state level, 28 states have enacted laws addressing deepfakes as of early 2026.

Tags: Government, Civil Rights, Digital Rights, Public Health

People, bills, and sources

James Strahler II

Defendant, first person convicted under Take It Down Act

Ted Cruz

U.S. Senator (R-TX); co-sponsor of Take It Down Act

Amy Klobuchar

U.S. Senator (D-MN); co-sponsor of Take It Down Act

Melania Trump

First Lady; advocate for Take It Down Act

Donald Trump

President of the United States

Dominick S. Gerace II

U.S. Attorney, Southern District of Ohio

Sarah D. Morrison

Chief U.S. District Judge, Southern District of Ohio

What you can do

1. Civic action: Report non-consensual intimate imagery immediately

Reporting triggers platform removal and federal investigation under the Take It Down Act.

Suggested script: "I'm reporting non-consensual intimate imagery or AI deepfakes. Under the Take It Down Act, platforms must remove this within 48 hours."

2. Civic action: Contact your U.S. Representative about broader AI regulation

The Take It Down Act addresses one specific harm. Congress needs to hear about broader AI safeguards.

Suggested script: "I'm calling about AI-generated sexual content. The Take It Down Act criminalizes publication, but doesn't prevent creation or platform misuse. I want stronger regulation of AI tool development."