Are there laws specifically banning deepfake pornography?

Yes, but only in specific contexts. The U.S. TAKE IT DOWN Act criminalizes nonconsensual intimate deepfakes at the federal level, and the UK made creating such images a criminal offence in January 2026 — but no country has a blanket ban on all deepfake pornography.

Laws targeting deepfake pornography have expanded rapidly, but they remain narrowly scoped. In the United States, the TAKE IT DOWN Act established a federal crime for distributing nonconsensual intimate imagery, including AI-generated deepfakes. At the state level, lawmakers in all 50 states introduced some form of sexual deepfake legislation in 2025, according to MultiState. AI-generated child sexual abuse material (CSAM) is illegal under federal law and explicitly criminalized in 45 states.

In the UK, Technology Secretary Liz Kendall announced in January 2026 that creating nonconsensual intimate deepfake images is now a specific criminal offence under the Data (Use and Access) Act. This followed earlier UK legislation targeting the sharing of such images. Similar laws exist across dozens of countries, though enforcement and scope vary significantly by jurisdiction.

Critically, these laws target nonconsensual or exploitative use cases — they do not prohibit all AI-generated sexual or intimate content. Consensual deepfake content, satire, and entertainment uses generally remain legal in most jurisdictions. As the law firm Traverse Legal notes, no single law bans all deepfakes across every use case; the legal landscape is a patchwork of targeted prohibitions rather than a comprehensive global ban.