The TAKE IT DOWN Act uses the term 'non-consensual intimate visual depiction' as its operative legal definition. The precision matters: 'visual depiction' extends beyond photographs to video, synthetic media, and AI-generated imagery. 'Intimate' is defined broadly to include nudity, sexual activity, and intimate contexts that a reasonable person would consider private. 'Non-consensual' means the depicted person did not give valid consent to the specific act of sharing, regardless of any prior consent to create the image. Covered platforms must remove a reported NCIVD within 48 hours of receiving a valid request; failure to comply is enforceable by the Federal Trade Commission as an unfair or deceptive practice, while the Act's criminal penalties apply to individuals who knowingly publish the content.
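
For platforms tracking compliance, a minimal sketch of the 48-hour window follows, assuming a `requestReceivedAt` timestamp is recorded when a valid request arrives; the names here are illustrative, not drawn from the statute or any particular platform's tooling.

```typescript
// Illustrative 48-hour compliance window. The clock runs from receipt
// of a valid removal request; how "receipt" is timestamped is an
// operational assumption, not statutory language.
const REMOVAL_WINDOW_MS = 48 * 60 * 60 * 1000;

function removalDeadline(requestReceivedAt: Date): Date {
  return new Date(requestReceivedAt.getTime() + REMOVAL_WINDOW_MS);
}

function isOverdue(requestReceivedAt: Date, now: Date = new Date()): boolean {
  return now.getTime() > removalDeadline(requestReceivedAt).getTime();
}
```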

Key facts about this term

  1. The NCIVD definition covers AI and synthetic media. Congress specifically included AI-generated imagery (the statute's 'digital forgeries') so that deepfakes and the outputs of 'nudify' apps are covered even when no real intimate image of the person exists.
  2. Consent must be specific and informed. Consent to create an image is not consent to share it, and consent to share in one context does not extend to other contexts. A valid NCIVD complaint exists whenever sharing occurs without specific consent to that act of sharing.
  3. NCIVD notices must meet statutory requirements. A valid TAKE IT DOWN Act request must carry the requester's signature, identify and locate the content, state a good-faith belief that the depiction is non-consensual, and provide contact information. ScanErase prepares compliant notices on your behalf; a structural sketch follows this list.
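
For teams that automate intake or track these requests programmatically, the sketch below models the statutory elements as a simple data structure. It is a hypothetical illustration, not ScanErase's implementation and not the statute's own language; all field and function names are assumptions.

```typescript
// Hypothetical model of the elements a TAKE IT DOWN Act removal
// request must contain. Field names are illustrative assumptions;
// the statute itself governs what makes a request valid.
interface RemovalRequest {
  signature: string;          // physical or electronic signature of the
                              // victim or an authorized representative
  contentLocations: string[]; // information reasonably sufficient to
                              // identify and locate the depiction (e.g. URLs)
  goodFaithStatement: string; // brief statement that the depiction was
                              // shared without consent
  contactInfo: string;        // how the platform can reach the requester
}

// Minimal pre-submission completeness check: every statutory element
// must be present and non-empty.
function isComplete(req: RemovalRequest): boolean {
  return (
    req.signature.trim().length > 0 &&
    req.contentLocations.length > 0 &&
    req.contentLocations.every((loc) => loc.trim().length > 0) &&
    req.goodFaithStatement.trim().length > 0 &&
    req.contactInfo.trim().length > 0
  );
}
```

A request missing any element risks rejection by the platform, so a completeness check like `isComplete` is a sensible gate before submission.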

Frequently asked questions

Is there a difference between NCII and NCIVD?

NCII (non-consensual intimate imagery) is the policy and advocacy term. NCIVD (non-consensual intimate visual depiction) is the precise legal term used in the TAKE IT DOWN Act. They refer to the same category of content.

Does NCIVD cover written descriptions or audio?

No. The statutory definition is limited to visual depictions. Written descriptions or audio recordings may be covered by other laws (harassment, extortion) but not by the TAKE IT DOWN Act's removal mandate.