Researchers Clare McGlynn and Erika Rackley coined the term 'image-based sexual abuse' (IBSA) to describe a continuum of harm that includes: upskirting and surreptitious recording; threats to share intimate images; actual non-consensual distribution; and deepfake or AI-generated content. The word 'abuse' is deliberate: it frames the conduct as a form of sexual violence, not merely an embarrassing incident. This framing influences how courts, legislators, and support services respond. In the U.S., the TAKE IT DOWN Act adopts some of this framing by covering AI-generated content and imposing mandatory removal timelines.

Key facts about this term

  1. IBSA is a continuum of harm, not a single act. Image-based sexual abuse begins with non-consensual creation and extends through threats, distribution, and repeated resharing. Each stage is distinct and may involve different legal remedies.
  2. The abuse framing is legally significant. Courts that treat IBSA as a form of sexual violence rather than a privacy tort tend to award higher damages and impose stricter platform liability standards.
  3. Other jurisdictions also use the IBSA framework. Australia, the UK, and Canada all have IBSA legislation that uses similar definitions. International removal requests may rely on this framework when content is hosted abroad.

Frequently asked questions

Is image-based sexual abuse the same as revenge porn?

They overlap significantly, but IBSA is broader. 'Revenge porn' implies an ex-partner motive, while IBSA covers all perpetrators and all stages of harm, including threats and AI-generated content.

Does ScanErase address all forms of IBSA?

ScanErase's core service addresses the distribution component: finding and removing posted intimate imagery. For surreptitious recording or threats, contacting law enforcement is recommended alongside removal efforts.