The term “deepfake” first came into public consciousness around 2017, when an anonymous person on Reddit who went by the handle “Deepfakes” started the discussion forum “r/deepfakes.” The forum was devoted to videos featuring the faces of Hollywood actresses on the bodies of adult film stars. Similar forms of deepfake pornography quickly moved from fringe online sectors to more easily accessible and mainstream platforms.
In its March 2021 report, the threat intelligence company Sensity noted that “defamatory, derogatory, and pornographic fake videos account for 93%” of all deepfakes, most of which target women.
The implications of this kind of deepfakery are significant. As legal scholar Danielle Citron writes, the weaponization of deepfakes against women “is terrifying, embarrassing, demeaning and silencing. Deepfake sex videos say to individuals that their bodies are not their own and can make it difficult to stay online, get or keep a job, and feel safe.”