What the Act covers

The TAKE IT DOWN Act (Pub. L. 119-12, signed into law in May 2025) criminalises the knowing publication of non-consensual intimate images (NCII), including AI-generated or manipulated deepfakes, which the statute terms "digital forgeries". Its removal obligations apply to "covered platforms": public-facing websites, online services, and apps that primarily provide a forum for user-generated content. The Act covered synthetic imagery from enactment, responding to the explosive growth of AI-generated intimate imagery, and gave platforms until May 2026 to stand up the required notice-and-removal process.

The 48-hour removal mandate

Upon receiving a valid removal request, covered platforms must remove the identified content as soon as possible, and in no case later than 48 hours after receipt, and must make reasonable efforts to remove identical copies. The Federal Trade Commission is the enforcement authority: a platform's failure to comply is treated as an unfair or deceptive act or practice under the FTC Act, exposing it to civil penalties and injunctive relief. The Act's criminal provisions, codified at 47 U.S.C. § 223a, apply to those who publish NCII rather than to non-compliant platforms, with penalties of up to two years' imprisonment where the depicted individual is an adult and up to three years where the individual is a minor.
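The 48-hour clock runs in real time from receipt of the request, with no business-day tolling. As a minimal illustration (the helper name and the use of UTC timestamps are our own assumptions, not anything the Act prescribes), a compliance system might compute the removal deadline like this:

```python
from datetime import datetime, timedelta, timezone

# Statutory removal window: 48 hours from receipt of a valid request.
REMOVAL_WINDOW = timedelta(hours=48)

def removal_deadline(received_at: datetime) -> datetime:
    """Latest permissible removal time for a request received at `received_at`.

    The window is a straight real-time offset from the receipt timestamp,
    so a timezone-aware datetime is required to avoid ambiguity.
    """
    if received_at.tzinfo is None:
        raise ValueError("use a timezone-aware timestamp")
    return received_at + REMOVAL_WINDOW

received = datetime(2026, 7, 1, 9, 30, tzinfo=timezone.utc)
print(removal_deadline(received).isoformat())  # 2026-07-03T09:30:00+00:00
```

Logging the receipt timestamp at intake, rather than when a reviewer first opens the request, is the conservative reading of "within 48 hours".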

Extraterritorial application

The Act's definition of "covered platform" contains no geographic limit, so a platform headquartered abroad is covered on the statute's terms if it serves the US public. Offshore image boards and adult content sites that serve US users therefore fall within the removal mandate, although practical enforcement against operators with no US presence remains more difficult.

Private right of action

The TAKE IT DOWN Act itself does not create a private right of action; its removal mandate is enforced by the FTC. Affected individuals may, however, bring a civil action in federal district court under 15 U.S.C. § 6851, which allows a person whose intimate images were disclosed without consent to recover actual damages or liquidated damages of up to $150,000, plus injunctive relief and reasonable attorney's fees. These civil remedies exist alongside, not instead of, FTC enforcement of the Act.

What constitutes a valid notice

A valid removal request under the Act must include: a physical or electronic signature of the depicted individual, or of someone authorised to act on their behalf; identification of the intimate visual depiction and information reasonably sufficient to locate it, such as a URL; a brief statement of the requester's good-faith belief that the depiction is non-consensual; and contact information for the requester. ScanErase generates notices that meet these formal requirements and dispatches them simultaneously to all platforms where your content has been identified.
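For illustration only, the required elements can be modelled as a simple record with a presence check. The class and field names below are our own sketch, not the ScanErase API or the statute's language:

```python
from dataclasses import dataclass

@dataclass
class RemovalNotice:
    signature: str             # signature of the depicted individual or their agent
    content_location: str      # URL or other information sufficient to locate the depiction
    good_faith_statement: str  # brief statement that the depiction is non-consensual
    contact_info: str          # how the platform can reach the requester

    def is_facially_valid(self) -> bool:
        """True if every required element is present (non-empty)."""
        return all(field.strip() for field in (
            self.signature,
            self.content_location,
            self.good_faith_statement,
            self.contact_info,
        ))

notice = RemovalNotice(
    signature="J. Doe",
    content_location="https://example.com/img/123",
    good_faith_statement="I did not consent to this depiction.",
    contact_info="j.doe@example.com",
)
print(notice.is_facially_valid())  # True
```

A presence check like this only confirms the notice is facially complete; whether the content is actually covered by the Act is a substantive question the platform must still assess.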

Frequently asked questions

What is the TAKE IT DOWN Act?
The TAKE IT DOWN Act (Pub. L. 119-12) is a federal law, signed in May 2025, that criminalises the knowing publication of non-consensual intimate images, including AI-generated deepfakes. It requires covered platforms to remove such content within 48 hours of receiving a compliant removal request; non-compliance is treated as an unfair or deceptive practice enforceable by the FTC.
Who does the TAKE IT DOWN Act apply to?
The Act's removal obligations apply to "covered platforms": public-facing websites, online services, and apps that primarily provide a forum for user-generated content, regardless of where the platform is headquartered. This covers the vast majority of social media platforms, image hosts, and adult content sites. The criminal provisions apply to any person who knowingly publishes NCII.
Does the TAKE IT DOWN Act cover AI-generated deepfakes?
Yes. The Act explicitly covers what it terms "digital forgeries": synthetically generated or altered visual depictions that, viewed as a whole, are indistinguishable from authentic images. A deepfake of an identifiable real person in a sexually explicit context is subject to the same removal mandate and comparable criminal penalties as an authentic photograph.
How do I file a removal notice under the TAKE IT DOWN Act?
A valid removal request under the Act must identify the content's location (such as a URL) and include a signature of the depicted individual or their authorised representative, a brief good-faith statement that the depiction is non-consensual, and contact information. ScanErase generates notices that meet all formal requirements and dispatches them simultaneously to all platforms where your content has been identified.
What happens if a platform ignores a TAKE IT DOWN Act notice?
Non-compliance is treated as an unfair or deceptive act or practice under the FTC Act, and the FTC, as the enforcement authority, may seek civil penalties and injunctive relief. The TAKE IT DOWN Act itself provides no private right of action, but affected individuals may separately sue in federal court under 15 U.S.C. § 6851 to recover actual damages or liquidated damages of up to $150,000, plus attorney's fees.