The term “deepfake” first entered public consciousness around 2017, when an anonymous Reddit user going by the handle “Deepfakes” started the discussion forum “r/deepfakes.” The forum was devoted to videos featuring the faces of Hollywood actresses swapped onto the bodies of adult film performers. Similar forms of deepfake pornography quickly moved from the fringes of the internet to more accessible and mainstream platforms.

In its March 2021 report, the threat intelligence company Sensity noted that “defamatory, derogatory, and pornographic fake videos account for 93%” of all deepfakes, most of which target women.

The implications of this kind of deepfakery are significant. As legal scholar Danielle Citron writes, the weaponization of deepfakes against women “is terrifying, embarrassing, demeaning and silencing. Deepfake sex videos say to individuals that their bodies are not their own and can make it difficult to stay online, get or keep a job, and feel safe.”

References
Nina Schick, Deepfakes: The Coming Infocalypse (New York: Twelve, 2020), 35–50.
Giorgio Patrini, The State of Deepfakes 2020: Update on Statistics and Trends, Sensity, March 2021, 2–4.
Danielle K. Citron, quoted in Giorgio Patrini et al., The State of Deepfakes: Landscape, Threats, and Impact, Deeptrace, September 2019, 6.
See also Danielle K. Citron and Robert Chesney, “Deep Fakes: A Looming Challenge for Privacy, Democracy, and National Security,” California Law Review 107 (2019): 1753.
