AI impersonation takes multiple forms: deepfake videos that make someone appear to say or do things they never did; voice clones used in scam calls; AI-generated social media profiles designed to impersonate a real person; and intimate imagery generated using someone's likeness. The remedies vary by form. Intimate AI imagery is covered by the TAKE IT DOWN Act. Non-intimate impersonation for fraud may be covered by identity theft or impersonation statutes. Defamatory AI-generated content may support a defamation claim.

Key facts about this term

  1. Intimate AI impersonation is covered by NCII law. AI-generated intimate imagery that depicts a real, identifiable individual falls under the TAKE IT DOWN Act, which requires platforms to remove it within 48 hours of notice.
  2. Non-intimate AI impersonation has separate remedies. Fake social media profiles, AI-generated fraud calls, and other impersonation may be addressed through platform policies, state impersonation laws, and fraud statutes.
  3. The right to one's own likeness is legally protected. Right of publicity laws in most U.S. states protect individuals from unauthorized commercial use of their likeness, a protection separate from but complementary to NCII law.

Frequently asked questions

Is it legal for someone to create AI imagery of my face without my consent?

Using AI to generate intimate imagery of a real, identifiable person is illegal under the TAKE IT DOWN Act. Non-intimate AI impersonation may be covered by other statutes depending on its purpose and the harm caused.

What if someone is using my face to run a social media account?

Report the account through the platform's impersonation reporting process. Most major platforms have explicit policies against impersonation and will remove fake accounts once reported.