Synthetic media encompasses more than deepfake videos. It includes AI-generated voice recordings, text-to-speech systems trained on someone's voice, entirely AI-generated images, and hybrid content that combines real and synthetic elements. In the context of NCII, synthetic media most commonly refers to AI-generated intimate images and face-swap videos. The TAKE IT DOWN Act is technology-neutral — it covers any visual depiction, synthetic or authentic, that meets its definition. Platform responsibilities apply equally to synthetic and authentic content.

Key facts about this term

  1. Synthetic media detection is a growing technical field. Detection companies, academic researchers, and platforms are developing biometric analysis tools, metadata forensics, and model-fingerprinting techniques to identify synthetic media.
  2. The TAKE IT DOWN Act is technology-neutral. Congress intentionally drafted broad language to ensure future AI technology developments are covered without requiring new legislation for each new modality.
  3. Voice-based synthetic media is not covered by NCII law. The TAKE IT DOWN Act covers visual depictions only. AI voice clones used for harassment may be addressed under other statutes (fraud, impersonation, harassment) but not NCII-specific law.

Frequently asked questions

Is a voice clone synthetic media?

Yes. AI-generated voice recordings are a form of synthetic media. They are not covered by NCII-specific law but may be actionable under impersonation, fraud, or harassment statutes.

What platforms are required to remove synthetic intimate media?

All covered platforms under the TAKE IT DOWN Act — which includes any online platform with more than a minimal number of U.S. users that hosts user-generated content — must remove verified synthetic NCII within 48 hours.