Deepfake pornography

From Wikipedia, the free encyclopedia

Deepfake pornography, or simply fake pornography, is a type of synthetic pornography created by altering existing pornographic material with deepfake technology, replacing the face of the original performer with that of another person. Deepfake pornography has been highly controversial, as it has commonly been used to place the faces of female celebrities, typically without their consent, onto the bodies of porn performers.[1]


Deepfake pornography prominently surfaced on the Internet in 2017, particularly on Reddit.[2] The first deepfake to capture widespread attention depicted Daisy Ridley and was featured in several articles.[2] Other prominent pornographic deepfakes targeted various other celebrities.[2][3][4][5] A report published in October 2019 by the Dutch cybersecurity startup Deeptrace estimated that 96% of all deepfakes online were pornographic.[6]

In December 2017, Samantha Cole published an article about r/deepfakes in Vice that drew the first mainstream attention to deepfakes being shared in online communities.[7] Six weeks later, Cole published a follow-up article on the sharp rise in AI-assisted fake pornography.[8] Since 2017, Cole has covered news surrounding deepfake pornography in a series of Vice articles.[9][10][11][12][5][13][7][8]

Since then, multiple social media outlets have banned or otherwise restricted deepfake pornography. Most notably, the r/deepfakes subreddit was banned by Reddit on February 7, 2018, for violating the site's policy against "involuntary pornography".[14][15][16][17][18][19] In the same month, representatives from Twitter stated that they would suspend accounts suspected of posting non-consensual deepfake content.[13]

Scarlett Johansson, a frequent subject of deepfake pornography, spoke publicly about the issue to The Washington Post in December 2018.[20] In a prepared statement, she said that despite her concerns she would not attempt to remove her deepfakes, as she believes they do not affect her public image and that differing laws across countries and the nature of internet culture make any removal effort "a lost cause".[20] While celebrities like herself are protected by their fame, she believes that deepfakes pose a grave threat to less prominent women, whose reputations could be damaged by depiction in involuntary deepfake pornography or revenge porn.[20]


In June 2019, a downloadable Windows and Linux application called DeepNude was released, which used neural networks, specifically generative adversarial networks, to remove clothing from images of women. The app had both a free and a paid version, the paid version costing $50.[12] On June 27, the creators removed the application and refunded consumers, although various copies of the app, both free and paid, continue to circulate.[21] An open-source version of the program called "open-deepnude" was deleted from GitHub.[22] The open-source version had the advantage that it could be trained on a larger dataset of nude images, increasing the accuracy of the resulting image.[23]

Deepfake CSAM

Deepfake technology has made the creation of child sexual abuse material (CSAM), often also referred to as child pornography, faster, safer and easier than ever before. Deepfakes can be used to produce new CSAM from existing material or to create CSAM depicting children who have not been subjected to sexual abuse. Deepfake CSAM can nevertheless have real and direct effects on children, including defamation, grooming, extortion, and bullying.[24] Additionally, deepfake child pornography creates further hurdles for police, making criminal investigations and victim identification harder.[citation needed]

Ethical debate

Deepfake pornography software can be misused to create what is effectively revenge porn targeting an individual, which can be deemed a form of harassment.[21]

Currently, footage produced by software such as DeepNude remains far from sophisticated enough to be indistinguishable from real footage under forensic analysis.[23][clarification needed]

Efforts by companies to limit deepfake pornography footage

On January 31, 2018, Gfycat began removing all deepfakes from its site.[11][25]

In February 2018, r/deepfakes was banned by Reddit for sharing involuntary pornography.[19] Other websites have also banned the use of deepfakes for involuntary pornography, including the social media platform Twitter and the pornography site Pornhub.[13] However, some websites have not banned deepfake content, including 4chan and 8chan.[26]

Also in February 2018, Pornhub said that it would ban deepfake videos from its website because such videos are considered "non-consensual content" that violates its terms of service.[10] Pornhub had previously told Mashable that it would take down content flagged as deepfakes.[27] In a 2018 article, writers for Motherboard reported that searching "deepfakes" on Pornhub still returned multiple recent deepfake videos.[10]

The chat service Discord has taken action against deepfakes in the past[28] and has taken a general stance against them.[25][29] In September 2018, Google added "involuntary synthetic pornographic imagery" to its ban list, allowing anyone to request that search results depicting their fake nudes be blocked.[30]


  1. ^ Dickson, E. J. (2019-10-07). "Deepfake Porn Is Still a Threat, Particularly for K-Pop Stars". Rolling Stone. Retrieved 2019-11-09.
  2. ^ a b c Roettgers, Janko (2018-02-21). "Porn Producers Offer to Help Hollywood Take Down Deepfake Videos". Variety. Retrieved 2018-02-28.
  3. ^ Goggin, Benjamin. "From porn to 'Game of Thrones': How deepfakes and realistic-looking fake videos hit it big". Business Insider. Retrieved 2019-11-09.
  4. ^ Lee, Dave (2018-02-03). "'Fake porn' has serious consequences". Retrieved 2019-11-09.
  5. ^ a b Cole, Samantha (2018-06-19). "Gfycat's AI Solution for Fighting Deepfakes Isn't Working". Vice. Retrieved 2019-11-09.
  6. ^ "The State of Deepfake - Landscape, Threats, and Impact" (PDF). Deeptrace. 2019-10-01. Retrieved 2020-07-07.
  7. ^ a b Cole, Samantha (2017-12-11). "AI-Assisted Fake Porn Is Here and We're All Fucked". Vice. Retrieved 2018-12-19.
  8. ^ a b Cole, Samantha (2018-01-24). "We Are Truly Fucked: Everyone Is Making AI-Generated Fake Porn Now". Vice. Retrieved 2019-12-15.
  9. ^ Cole, Samantha (2019-06-11). "This Deepfake of Mark Zuckerberg Tests Facebook's Fake Video Policies". Vice. Retrieved 2019-12-15.
  10. ^ a b c Cole, Samantha (2018-02-06). "Pornhub Is Banning AI-Generated Fake Porn Videos, Says They're Nonconsensual". Vice. Retrieved 2019-11-09.
  11. ^ a b Cole, Samantha (2018-01-31). "AI-Generated Fake Porn Makers Have Been Kicked Off Their Favorite Host". Vice. Retrieved 2019-11-18.
  12. ^ a b Cole, Samantha; Maiberg, Emanuel; Koebler, Jason (2019-06-26). "This Horrifying App Undresses a Photo of Any Woman with a Single Click". Vice. Retrieved 2019-07-02.
  13. ^ a b c Cole, Samantha (2018-02-06). "Twitter Is the Latest Platform to Ban AI-Generated Porn". Vice. Retrieved 2019-11-08.
  14. ^ Böhm, Markus (2018-02-07). ""Deepfakes": Firmen gehen gegen gefälschte Promi-Pornos vor". Spiegel Online. Retrieved 2019-11-09.
  15. ^ barbara.wimmer (2018-02-08). "Deepfakes: Reddit löscht Forum für künstlich generierte Fake-Pornos" (in German). Retrieved 2019-11-09.
  16. ^ online, heise. "Deepfakes: Auch Reddit verbannt Fake-Porn". heise online (in German). Retrieved 2019-11-09.
  17. ^ "Reddit verbannt Deepfake-Pornos". DER STANDARD (in Austrian German). Retrieved 2019-11-09.
  18. ^ Robertson, Adi (2018-02-07). "Reddit bans 'deepfakes' AI porn communities". The Verge. Retrieved 2019-11-09.
  19. ^ a b Kharpal, Arjun (2018-02-08). "Reddit, Pornhub ban videos that use A.I. to superimpose a person's face over an X-rated actor". CNBC. Retrieved 2018-02-20.
  20. ^ a b c "Scarlett Johansson on fake AI-generated sex videos: 'Nothing can stop someone from cutting and pasting my image'". The Washington Post. 2018-12-31. Retrieved 2019-06-19.
  21. ^ a b "DeepNude AI copies easily accessible online". 2019-07-03.
  22. ^ Cox, Joseph (2019-07-09). "GitHub Removed Open Source Versions of DeepNude". Vice Media.
  23. ^ a b "DeepNude is back".
  24. ^ Kirchengast, T (2020). "Deepfakes and image manipulation: criminalisation and control". Information & Communications Technology Law. 29 (3): 308–323. doi:10.1080/13600834.2020.1794615. S2CID 221058610.
  25. ^ a b Ghoshal, Abhimanyu (2018-02-07). "Twitter, Pornhub and other platforms ban AI-generated celebrity porn". The Next Web. Retrieved 2019-11-09.
  26. ^ Hathaway, Jay (2018-02-08). "Here's where 'deepfakes,' the new fake celebrity porn, went after the Reddit ban". The Daily Dot. Retrieved 2018-12-22.
  27. ^ Beres, Damon; Gilmer, Marcus (2018-02-02). "A guide to 'deepfakes,' the internet's latest moral crisis". Mashable. Retrieved 2019-11-09.
  28. ^ Price, Rob (2018-01-27). "Discord just shut down a chat group dedicated to sharing porn videos edited with AI to include celebrities". Business Insider Australia. Retrieved 2019-11-28.
  29. ^ "Twitter bans 'deepfake' AI-generated porn". Engadget. Retrieved 2019-11-28.
  30. ^ Harrell, Drew. "Fake-porn videos are being weaponized to harass and humiliate women: 'Everybody is a potential target'". The Washington Post. Retrieved 2019-01-01.