Photo manipulation for NCII purposes predates generative AI. Traditional methods include head-splicing (attaching a real person's face to another body in Photoshop), body morphing, and removal of clothing through editing. Modern AI tools have made these manipulations dramatically faster and more realistic. The TAKE IT DOWN Act covers 'non-consensual intimate visual depictions', a term that includes manipulated imagery — not just authentically captured or purely AI-generated content. Any intimate visual depiction of a real, identifiable individual produced without consent is covered.

Key facts about this term

  1. Manipulated imagery is covered the same as authentic NCII. A Photoshop composite placing your face on a nude body is legally identical to an authentic intimate photo of you for purposes of the TAKE IT DOWN Act.
  2. The manipulation does not need to be AI-generated. Traditional editing tools like Photoshop, GIMP, or video editing software can produce manipulated NCII. The legal standard is the resulting image, not the tool used.
  3. Biometric detection finds manipulated images. Face-composited or head-spliced images retain the victim's facial geometry, so biometric face embedding comparison can identify them even when the body is synthetic or belongs to someone else.
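
The embedding comparison described in point 3 can be sketched as follows. This is a minimal illustration, not a production detection pipeline: the function names, the toy 4-dimensional vectors, and the 0.8 threshold are all hypothetical (real face-recognition models emit 128- to 512-dimensional embeddings and tune thresholds per model).

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_same_face(embedding_a, embedding_b, threshold=0.8):
    # Hypothetical threshold; real systems calibrate this per model.
    return cosine_similarity(embedding_a, embedding_b) >= threshold

# Toy embeddings standing in for model output.
victim_face = [0.9, 0.1, 0.3, 0.2]
spliced_face = [0.88, 0.12, 0.31, 0.19]   # face geometry preserved by head-splicing
unrelated_face = [0.1, 0.9, 0.2, 0.7]

print(is_same_face(victim_face, spliced_face))    # True
print(is_same_face(victim_face, unrelated_face))  # False
```

Because head-splicing leaves the face itself largely untouched, the spliced image's face embedding stays close to the victim's reference embedding, which is why this comparison works even when everything else in the image is synthetic.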

Frequently asked questions

What if the manipulation is obvious and clearly fake?

The TAKE IT DOWN Act does not require that the manipulation be realistic or convincing. A clearly fake composite is still covered if it depicts a real identifiable person in an intimate context without consent.

Is a meme that depicts me in a sexual context covered?

If the meme depicts you in an intimate or sexual context without consent, it may be covered, depending on how 'intimate' is defined in the applicable statute. Consult a privacy attorney about your specific case.