Deepfakes and Minors
Deepfake non-consensual intimate imagery (NCII) targeting minors has emerged as a distinct and rapidly growing crisis. The National Center for Missing & Exploited Children (NCMEC) received over 26,000 sextortion reports involving minors in 2023, and school-based deepfake incidents increased by 300% between 2023 and 2025.
The targeting of minors with AI-generated intimate imagery is among the most urgent dimensions of the deepfake NCII crisis. Consumer-facing "nudify" apps with minimal age verification have enabled classmates to generate fake nudes of fellow students from school yearbook photos. High-profile incidents in Almendralejo, Spain; Beverly Hills, California; and Westfield, New Jersey drew widespread attention. The TAKE IT DOWN Act addresses minors directly, requiring platforms to prioritize removal of minors' intimate imagery, and NCMEC's Take It Down program provides hash-based prevention designed specifically for minor victims.
Key facts about this term
- Intimate images of minors have dual legal coverage: AI-generated intimate images of minors are covered both by NCII law (the TAKE IT DOWN Act) and by existing child sexual abuse material (CSAM) statutes, which carry significantly higher penalties.
- NCMEC's Take It Down provides hash-based prevention for minors: minor victims (or their parents) can create hashes of intimate images without sharing the images themselves, enabling participating platforms to automatically block re-uploads (see the sketch after this list).
- Schools have reporting and response obligations: most state education laws require schools to address student-on-student sexual harassment, which includes deepfake NCII, and schools may also have mandatory reporting obligations to law enforcement.
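To make the hash-based approach concrete, here is a minimal sketch of the pattern in Python. Everything in it is illustrative: the function names and the in-memory blocklist are hypothetical, and Take It Down itself relies on robust perceptual hashing (designed to match re-encoded or lightly edited copies) rather than the exact-match SHA-256 used here for simplicity. What the sketch demonstrates is the privacy property: only the hash ever leaves the victim's device.

```python
import hashlib

# Hypothetical in-memory blocklist standing in for a shared hash registry.
BLOCKED_HASHES: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    """Hash the image locally. Only this digest is ever shared; the image is not."""
    return hashlib.sha256(image_bytes).hexdigest()

def register_victim_hash(image_bytes: bytes) -> str:
    """Victim-side step: derive the hash on-device and submit only the hash value."""
    digest = fingerprint(image_bytes)
    BLOCKED_HASHES.add(digest)  # in practice, submitted to a service such as Take It Down
    return digest

def upload_allowed(upload_bytes: bytes) -> bool:
    """Platform-side step: reject any upload whose hash is on the blocklist."""
    return fingerprint(upload_bytes) not in BLOCKED_HASHES

# Usage sketch: after the victim registers a hash, a re-upload of the same bytes is blocked.
register_victim_hash(b"...image bytes...")
assert upload_allowed(b"...image bytes...") is False
assert upload_allowed(b"some other image") is True
```

One design point worth noting: an exact cryptographic hash like SHA-256 fails to match if even a single pixel changes, which is why production systems use perceptual hashes instead. The matching logic above stays the same; only the fingerprint function differs.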
Frequently asked questions
What should parents do if their child is targeted by deepfake NCII?
Contact law enforcement immediately, and do not ask the child to view or confirm the content. Submit hashes through NCMEC's Take It Down program to block re-uploads, seek trauma-informed counseling for the child, and contact the school district's Title IX coordinator.
Can classmates be prosecuted for creating deepfakes of fellow students?
Yes. The TAKE IT DOWN Act applies regardless of the perpetrator's age. Additionally, minors who create intimate imagery of other minors are producing CSAM-equivalent content and may face juvenile justice proceedings.