
How Deepfakes Have Sunk Revenge Porn to New Depths

Revenge porn, the nonconsensual distribution of a sexually explicit image or video of another person in order to harm them, has become a widespread problem over the last twelve years.[1] But what happens when it only looks like you’re in one of those videos, and it’s not actually you? In other words: what do you do when your deepfake is engaging in a sexual act for all the world to see?

Deepfakes (a portmanteau of “deep learning” and “fake”) are hyperreal videos, generated with artificial intelligence, that depict people doing and saying things they have never done or said.[2] Deepfake technology uses deep learning to analyze a large set of photos of a person’s face, learning the person’s facial structure well enough to superimpose it onto another person’s face.[3] Though the technology has been developing for some time, today’s deepfakes are so advanced that an actor filmed in real time can, simply by moving his face, manipulate a video of President Bush speaking, and it is hard to tell the difference.[4]
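To make the mechanics concrete, the sketch below illustrates the shared-encoder, per-person-decoder design commonly described for early face-swap deepfakes: a single encoder learns a common representation of facial structure from many photos, each person gets their own decoder, and the “swap” is simply decoding one person’s encoded face with the other person’s decoder. This is a minimal, hypothetical illustration written in PyTorch, not the code of any particular tool; the network sizes, names, and random stand-in images are all assumptions, and real systems add face detection, alignment, adversarial losses, and far more training.

```python
# Illustrative sketch (assumed architecture, not any specific tool's API):
# one shared encoder, one decoder per person; swapping = encode A, decode with B.
import torch
import torch.nn as nn

LATENT = 128  # size of the shared face representation (assumed)

class Encoder(nn.Module):
    """Maps a 64x64 RGB face crop to a latent vector."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32 -> 16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, LATENT),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a 64x64 RGB face crop from the latent vector."""
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(LATENT, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32 -> 64
            nn.Sigmoid(),
        )

    def forward(self, z):
        h = self.fc(z).view(-1, 64, 16, 16)
        return self.net(h)

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()  # one decoder per person

# Training (sketched): reconstruct each person's own faces through the shared
# encoder and that person's decoder, so the encoder learns facial structure.
faces_a = torch.rand(8, 3, 64, 64)  # random stand-ins for real face crops
loss = nn.functional.mse_loss(decoder_a(encoder(faces_a)), faces_a)

# The "swap": encode person A's face, decode it with person B's decoder.
swapped = decoder_b(encoder(faces_a))
print(swapped.shape)  # torch.Size([8, 3, 64, 64])
```

The key design point is that because the encoder is shared across both people, it is forced to learn person-independent features (pose, expression, lighting), which is what lets one decoder render another person’s expressions convincingly.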

And though the technology is complex, the tools needed to make one’s own deepfake are readily accessible. On the subreddit r/SFWdeepfakes (which deems itself the “official” deepfake subreddit), users recommend user-friendly programs to those who are new to the hobby.[5] These programs are highly accessible: many are open source (meaning anyone can download the software for free), and some are as simple as an iPhone app, of which there are plenty.[6] As one artificial intelligence researcher aptly noted to Vice, “this is no longer rocket science.”[7] While pages like r/SFWdeepfakes are mostly for people who want to superimpose Taylor Swift’s face onto members of the K-pop group BLACKPINK,[8] some people have used deepfakes for darker purposes. Reports emerged in 2017 that people were superimposing celebrities’ faces onto actors’ faces in porn videos.[9]

And from there, it only got worse. Because of the accessibility of these deepfake sites and apps, users started flocking to subreddits, some with over 100,000 members, to share tips on how to superimpose an ex’s face onto porn videos.[10] Though Reddit has since banned these forums, the technology remains present and easy to access.[11] As a result, people’s (particularly women’s) safety and privacy are at risk, as anyone, whether an ex, an acquaintance, or a coworker, can easily make deepfake porn to blackmail a victim or tarnish a reputation. This is a clear subset of revenge porn. And revenge porn is serious: not only can it cause significant emotional, psychological, and economic turmoil, it often leads to stalking or other criminal acts that put victims at physical risk.[12]

Various legal organizations, academics, and media groups have flagged this risk and called for a federal and state response, noting that deepfakes pose a domestic violence issue that grows more dangerous as it becomes harder to distinguish reality from fabrication.[13]

But despite the attention and outrage, the response at the state and federal level has been uneven. In 2019, Democrats introduced the DEEP FAKES Accountability Act to establish civil and criminal penalties for those who fail to obtain consent for the images they use, but the bill made no headway.[14] In the absence of a specific law, offenders can be prosecuted under 18 U.S.C. §§ 875, 1030, and 2261, which criminalize cyber ransom threats, computer hacking, and cyberstalking, respectively.[15] But there is no federal statute specifically on point. Similarly, no state has a law on point for deepfake revenge porn, though, unlike the federal government, states such as California, Texas, Florida, and New York all criminalize the distribution of revenge porn and cyberstalking.[16]

And though a victim of deepfake revenge porn can pursue several federal and state routes to a remedy, multiple legal issues stand in the way. The first is anonymity: it may be difficult for a victim to identify the perpetrator if the video was uploaded to a site that permits anonymous use, or if the perpetrator used software that masks their IP address.[17] The second is defining liability when deepfake revenge porn involves two victims: the person who actually engaged in the sex act and the person whose face is superimposed onto that person’s body.[18] To address these issues, there needs to be a statute that confronts the unique problem that is deepfake revenge porn.

Deepfakes will only grow more sophisticated. Though the government characteristically moves slowly, especially with respect to technology, there must be a push at both the state and federal level to protect people from the physical and emotional harm that deepfakes can cause. Only then will deterrence of deepfake revenge porn work.


Shira Kindler

Shira Kindler is a second-year J.D. candidate at Fordham University School of Law and a staff member of the Intellectual Property, Media & Entertainment Law Journal. She holds a B.A. in Political Economy from the University of California, Berkeley.