Generative AI has transformed image creation. Tools like Stable Diffusion, Midjourney, and DALL-E can produce photorealistic images of fictional people or, when prompted with reference photos, realistic depictions of real individuals in any scenario. Applied to intimate content, this technology has created an explosion of NCII that did not require stealing or recording any authentic intimate image. Congress addressed this directly in the TAKE IT DOWN Act, which explicitly includes AI-generated and synthetic imagery in its definition of covered content.

Key facts about this term

  1. AI imagery depicting real people without consent is NCII. A photorealistic AI image that depicts an identifiable real person in an intimate or sexual context — even when entirely synthetic — is a non-consensual intimate visual depiction under the TAKE IT DOWN Act.
  2. Only a public photo of your face is required. Perpetrators need nothing more than a profile photo to generate realistic AI intimate imagery of a target. No intimate images of the victim need to exist.
  3. Covered platforms must remove it within 48 hours. Once ScanErase sends a valid 223a notice identifying the AI-generated NCII and certifying non-consent, covered platforms have 48 hours to remove the content.

Frequently asked questions

Can I tell if an image of me is AI-generated or a real photo?

High-quality AI images are increasingly difficult to distinguish from real photographs. ScanErase's detection system uses biometric face embedding comparison rather than visual inspection, so the distinction is less important for finding and removing the content.
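Face embedding comparison generally works by mapping each face to a numeric vector and measuring how close two vectors are, rather than inspecting pixels. The sketch below illustrates that idea with cosine similarity; the function names, the embeddings, and the 0.6 threshold are illustrative assumptions, not ScanErase's actual implementation.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face embedding vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def likely_same_person(emb_a: np.ndarray, emb_b: np.ndarray, threshold: float = 0.6) -> bool:
    # Threshold is a hypothetical value; production systems tune it on labeled face pairs.
    # Because the comparison runs on embeddings, it applies equally to AI-generated
    # and authentic images of the same face.
    return cosine_similarity(emb_a, emb_b) >= threshold

# Toy vectors standing in for embeddings produced by a face recognition model.
ref = np.array([0.9, 0.1, 0.4])       # embedding from the user's reference photo
candidate = np.array([0.8, 0.2, 0.5])  # embedding from a scanned image
match = likely_same_person(ref, candidate)
```

In practice the embeddings would come from a trained face recognition model, and candidate images would be compared against the user's reference embedding in bulk.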

Are all AI-generated intimate images of real people illegal?

Under the TAKE IT DOWN Act, an AI-generated intimate image of a real, identifiable person is covered if it meets the statute's definition of a non-consensual intimate visual depiction and was created or shared without consent. Fictional characters and purely synthetic persons who do not depict any identifiable individual are not covered.