From Wikipedia, the free encyclopedia

Deepfake (a portmanteau of "deep learning" and "fake"[1]) is a technique for human image synthesis based on artificial intelligence. It is used to combine and superimpose existing images and videos onto source images or videos using a machine learning technique known as a generative adversarial network (GAN).[2] The term "deepfake" was coined in 2017.
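In a generative adversarial network, a generator network learns to produce samples that a discriminator network cannot tell apart from real data, while the discriminator simultaneously learns to tell them apart. The following is a minimal sketch of that adversarial loop, with toy one-dimensional data, single-layer networks, and illustrative hyperparameters standing in for the deep image-generating networks used in practice:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# "Real" data the generator must learn to imitate: samples from N(4, 1).
def real_batch(n):
    return rng.normal(4.0, 1.0, size=(n, 1))

# Generator G(z) = z @ w_g + b_g and discriminator D(x) = sigmoid(x @ w_d + b_d),
# each a single affine layer so the adversarial loop stays readable.
w_g, b_g = np.array([[1.0]]), np.zeros(1)
w_d, b_d = np.array([[1.0]]), np.zeros(1)

lr = 0.05
for _ in range(500):
    z = rng.normal(size=(32, 1))
    fake = z @ w_g + b_g
    real = real_batch(32)

    # Discriminator step: binary cross-entropy, label 1 for real, 0 for fake.
    for x, label in ((real, 1.0), (fake, 0.0)):
        p = sigmoid(x @ w_d + b_d)
        g = p - label                  # gradient of the loss w.r.t. the logit
        w_d -= lr * x.T @ g / len(x)
        b_d -= lr * g.mean(axis=0)

    # Generator step: push D(G(z)) toward 1 by backpropagating through D.
    p = sigmoid(fake @ w_d + b_d)
    g_fake = (p - 1.0) @ w_d.T         # chain rule through the discriminator
    w_g -= lr * z.T @ g_fake / len(z)
    b_g -= lr * g_fake.mean(axis=0)

# After training, generator samples should have drifted toward the real data.
samples = rng.normal(size=(1000, 1)) @ w_g + b_g
```

Deepfake systems apply the same adversarial idea to face images rather than scalars, with convolutional networks in both roles.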

Because of these capabilities, deepfakes have been used to create fake celebrity pornographic videos or revenge porn.[3] Deepfakes can also be used to create fake news and malicious hoaxes.[4][5]


Development[edit]

The development of deepfakes has taken place to a large extent in two settings: research at academic institutions, and development by amateurs in online communities.

Academic research[edit]

Academic research related to deepfakes lies predominantly within the field of computer vision, a subfield of computer science often grounded in artificial intelligence that focuses on computer processing of digital images and videos. An early landmark project was the Video Rewrite program, published in 1997, which modified existing video footage of a person speaking to depict that person mouthing the words contained in a different audio track.[6] It was the first system to fully automate this kind of facial reanimation, and it did so using machine learning techniques to make connections between the sounds produced by a video’s subject and the shape of their face.

Contemporary academic projects have focused on creating more realistic videos and on making techniques simpler, faster, and more accessible. The “Synthesizing Obama” program, published in 2017, modifies video footage of former president Barack Obama to depict him mouthing the words contained in a separate audio track.[7] The project lists as a main research contribution its photorealistic technique for synthesizing mouth shapes from audio. The Face2Face program, published in 2016, modifies video footage of a person’s face to depict them mimicking the facial expressions of another person in real time.[8] The project lists as a main research contribution the first method for reenacting facial expressions in real time using a camera that does not capture depth, making it possible for the technique to be performed using common consumer cameras.
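At the core of a system like "Synthesizing Obama" is a learned mapping from audio features to mouth shapes. The paper uses a recurrent neural network and photorealistic texture synthesis; the sketch below reduces the learning problem to its simplest form, a linear least-squares fit on synthetic stand-in data, with all dimensions purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: 13 audio features per frame (MFCC-like), and
# 20 mouth-landmark coordinates per frame as the "mouth shape".
N_FRAMES, AUDIO_DIM, MOUTH_DIM = 500, 13, 20

# Synthetic stand-in data: mouth shapes generated from audio by a hidden
# linear rule plus noise, which least squares should approximately recover.
true_map = rng.normal(size=(AUDIO_DIM, MOUTH_DIM))
audio = rng.normal(size=(N_FRAMES, AUDIO_DIM))
mouth = audio @ true_map + 0.01 * rng.normal(size=(N_FRAMES, MOUTH_DIM))

# Fit the mapping: W = argmin || audio @ W - mouth ||^2.
W, *_ = np.linalg.lstsq(audio, mouth, rcond=None)

# Predict mouth shapes for unseen audio frames of the target speaker.
new_audio = rng.normal(size=(10, AUDIO_DIM))
predicted_mouth = new_audio @ W
```

Real systems replace the linear map with a recurrent network and composite the predicted mouth region back into target video frames.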

Amateur development[edit]

The term deepfakes originated around the end of 2017 from a Reddit user named "deepfakes."[9] He, as well as others in the Reddit community r/deepfakes, shared deepfakes they created; many videos involved celebrities’ faces swapped onto the bodies of actresses in pornographic videos,[9] while non-pornographic content included many videos with actor Nicolas Cage’s face swapped into various movies.[10] In December 2017, Samantha Cole published an article about r/deepfakes in Vice that drew the first mainstream attention to deepfakes being shared in online communities.[11] Six weeks later, Cole wrote in a follow-up article about the large increase in AI-assisted fake pornography.[9] In February 2018, r/deepfakes was banned by Reddit for sharing involuntary pornography, and other websites have also banned the use of deepfakes for involuntary pornography, including the social media platform Twitter and the pornography site Pornhub.[12] Other online communities remain, however, including Reddit communities that do not share pornography, such as r/SFWdeepfakes (short for "safe for work deepfakes"), in which community members share deepfakes depicting celebrities, politicians, and others in non-pornographic scenarios.[13] Other online communities continue to share pornography on platforms that have not banned deepfake pornography.[12]


Pornography[edit]

Deepfake pornography surfaced on the Internet in 2017, particularly on Reddit,[14] and has been banned by sites including Reddit, Twitter, and Pornhub.[15][16][17] In autumn 2017, an anonymous Reddit user under the pseudonym "deepfakes" posted several porn videos on the Internet. The first one to capture attention was the Daisy Ridley deepfake, which became one of the better-known deepfake videos and featured prominently in several articles. Another was a deepfake simulation of Gal Gadot having sex with her step-brother, while others depicted celebrities such as Emma Watson, Katy Perry, Taylor Swift, and Scarlett Johansson. The scenes were not real, having been created with artificial intelligence, and they were debunked a short time later.

As time went on, the Reddit community ironed out many flaws in the faked videos, making it increasingly difficult to distinguish fake from genuine content. Non-pornographic photographs and videos of celebrities, which are readily available online, were used as training data for the software. The deepfake phenomenon was first reported in December 2017 in Motherboard, the technology section of Vice, leading to widespread reporting in other media.[18][19]

Scarlett Johansson, a frequent subject of deepfake porn, spoke publicly about the subject to The Washington Post in December 2018. In a prepared statement, she expressed concern about the phenomenon, describing the internet as a "vast wormhole of darkness that eats itself." However, she also stated that she wouldn't attempt to remove any of her deepfakes, due to her belief that they don't affect her public image and that differing laws across countries and the nature of internet culture make any attempt to remove the deepfakes "a lost cause"; she believes that while celebrities like herself are protected by their fame, deepfakes pose a grave threat to women of lesser prominence who could have their reputations damaged by depiction in involuntary deepfake pornography or revenge porn.[20]

In the United Kingdom, producers of deepfake material can be prosecuted for harassment, but there are calls to make deepfake a specific crime;[21] in the United States, where charges as varied as identity theft, cyberstalking, and revenge porn have been pursued, the notion of a more comprehensive statute has also been discussed.[22]


Politics[edit]

Deepfakes have been used to misrepresent well-known politicians on video portals and in chatrooms. For example, the face of the Argentine President Mauricio Macri was replaced by the face of Adolf Hitler, and Angela Merkel's face was replaced with Donald Trump's.[23][24] In April 2018, Jordan Peele and Jonah Peretti created a deepfake of Barack Obama that served as a public service announcement about the danger of deepfakes.[25] In January 2019, Fox television affiliate KCPQ aired a deepfake of Trump during his Oval Office address, mocking his appearance and skin color.[26]

Deepfake software[edit]

In January 2018, a desktop application called FakeApp was launched. The app allows users to easily create and share videos in which faces have been swapped. It uses an artificial neural network, the computing power of the graphics processor, and three to four gigabytes of storage space to generate the fake video. To produce detailed results, the program needs a large amount of visual material of the person to be inserted, so that the deep learning algorithm can learn from the video sequences and images which aspects of the image have to be exchanged.
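FakeApp-style face swapping is commonly described as training one shared encoder with a separate decoder per identity: the encoder learns identity-independent structure (pose, expression), each decoder learns to render one person's appearance, and swapping means encoding face A but decoding with person B's decoder. A minimal numpy sketch of that architecture, with linear layers, random stand-in "faces", and illustrative dimensions in place of real convolutional networks and face crops:

```python
import numpy as np

rng = np.random.default_rng(0)
DIM, LATENT = 64, 16  # flattened face-patch size and bottleneck size (illustrative)

# One shared encoder; one decoder per identity.
params = {
    "enc": 0.1 * rng.normal(size=(DIM, LATENT)),
    "dec_a": 0.1 * rng.normal(size=(LATENT, DIM)),
    "dec_b": 0.1 * rng.normal(size=(LATENT, DIM)),
}

def train_step(x, dec_key, lr=0.01):
    """One mean-squared-error autoencoder step on a batch of 'faces'."""
    enc, dec = params["enc"], params[dec_key]
    z = x @ enc                      # encode through the shared bottleneck
    err = z @ dec - x                # reconstruction error
    # Gradient descent on the squared error (up to a constant factor).
    params[dec_key] = dec - lr * z.T @ err / len(x)
    params["enc"] = enc - lr * x.T @ (err @ dec.T) / len(x)
    return float((err ** 2).mean())

def swap(faces):
    """Render person B's appearance onto the pose/expression of given faces."""
    return faces @ params["enc"] @ params["dec_b"]

# Train both autoencoders on synthetic stand-in "faces" (random vectors here;
# real tools use thousands of aligned face crops of each person).
faces_a = rng.normal(size=(200, DIM))
faces_b = rng.normal(size=(200, DIM))
losses = []
for _ in range(100):
    losses.append(train_step(faces_a, "dec_a"))
    train_step(faces_b, "dec_b")

swapped = swap(faces_a[:5])
```

Because the encoder is shared across both identities, it is forced to represent what the faces have in common, which is what makes the cross-decoding swap work.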

The software uses Google's AI framework TensorFlow, which among other things was already used for the program DeepDream. Celebrities are the main subjects of such fake videos, but other people also appear.[27][28][29] In August 2018, researchers at the University of California, Berkeley published a paper introducing a fake dancing app that can create the impression of masterful dancing ability using AI.[30][31]

There are also open-source alternatives to the original FakeApp program, like DeepFaceLab,[32] FaceSwap (currently hosted on GitHub)[33] and myFakeApp (currently hosted on Bitbucket).[34][35]



Criticism[edit]

According to the Aargauer Zeitung, the manipulation of images and videos using artificial intelligence could become a dangerous mass phenomenon. However, the falsification of images and videos is older than the advent of video editing and image editing software; what is new in this case is the degree of realism.[23]

It is also possible to use deepfakes for targeted hoaxes and revenge porn.[36][37]

Effects on credibility and authenticity[edit]

One effect of deepfakes is that it can no longer be reliably determined whether content is deliberately manipulated (e.g. satire) or genuine. AI researcher Alex Champandard has said that everyone should know how quickly things can be corrupted with this technology, and that the problem is not a technical one, but rather one to be solved by trust in information and journalism. The primary pitfall is that humanity could fall into an age in which it can no longer be determined whether a medium's content corresponds to the truth.[23]

Internet reaction[edit]

Some websites, such as Twitter and Gfycat, announced that they would delete deepfake content and block its publishers. Previously, the chat platform Discord had blocked a chat channel containing fake celebrity porn videos. The pornography website Pornhub also plans to block such content; however, it has been reported that the site has not been enforcing its ban.[38][39] On Reddit, the situation initially remained unclear until the subreddit was suspended on February 7, 2018, for violating the site's policy against "involuntary pornography".[19][40][41][42] In September 2018, Google added "involuntary synthetic pornographic imagery" to its ban list, allowing anyone to request the blocking of results showing their fake nudes.[43]

Popular culture[edit]

Picaper by Jack Wodhams[edit]

The Mid-December 1986 issue of Analog magazine published the story "Picaper" by Jack Wodhams. Its plot revolves around digitally enhanced or digitally generated videos produced by skilled hackers serving unscrupulous lawyers and political figures. The main protagonist, a troubleshooter for the Systems Monitoring Department, comes across an unusually high-quality fake video.[44]

Jack Wodhams calls such fabricated videos "picaper" or "mimepic": image animation based on "the information from the presented image, and copied through choices from an infinite number of variables that a program might supply".

For instance, someone gets a record of you walking and talking, your facial expressions. These impressions can be broken down to their individual matrix composites, can be analyzed, rearranged, and can then be extrapolated through known standard human behavioral patterns, so that an image of you may be re-projected doing and saying things that you have not in fact done or said.[44]

To Wodhams, pornography is not the major danger of this technology.

There is just a chance that this could be a matter of national interest. Something bigger than pornography or unlicensed alterations to copyrighted portrayals.[44]

In the story, interactive fake video is injected in a video conference call.

These days an actual gathering of a team around a table had become very much a rarity. It was so much simpler, and more convenient, to gather by vieway upon an exclusive circuit. Sessions could be as private and enclosed as any. No system was absolutely impenetrable, however. An opponent might easily sneak a doppelganger into a top-level conference. If accepted, such a known person could glean a great deal of information effortlessly and unnoticeably. Not just getting information out, but putting information in. A group of convincing surrogates could feed out opinions quite contrary to those held by their original twins. To sow drastic confusion. To achieve a dramatic coup.[44]

The sobering conclusion is that "the old idea that pictures do not lie is going to have to undergo drastic revision".


References[edit]

  1. ^ Brandon, John (2018-02-16). "Terrifying high-tech porn: Creepy 'deepfake' videos are on the rise". Fox News. Retrieved 2018-02-20.
  2. ^ Schwartz, Oscar (12 November 2018). "You thought fake news was bad? Deep fakes are where truth goes to die". The Guardian. Retrieved 14 November 2018.
  3. ^ "What Are Deepfakes & Why the Future of Porn is Terrifying". Highsnobiety. 2018-02-20. Retrieved 2018-02-20.
  4. ^ "Experts fear face swapping tech could start an international showdown". The Outline. Retrieved 2018-02-28.
  5. ^ Roose, Kevin (2018-03-04). "Here Come the Fake Videos, Too". The New York Times. ISSN 0362-4331. Retrieved 2018-03-24.
  6. ^ Bregler, Christoph; Covell, Michele; Slaney, Malcolm (1997). "Video Rewrite: Driving Visual Speech with Audio". Proceedings of the 24th Annual Conference on Computer Graphics and Interactive Techniques. 24: 353–360 – via ACM Digital Library.
  7. ^ Suwajanakorn, Supasorn; Seitz, Steven M.; Kemelmacher-Shlizerman, Ira (July 2017). "Synthesizing Obama: Learning Lip Sync from Audio". ACM Trans. Graph. 36.4: 95:1–95:13 – via ACM Digital Library.
  8. ^ Thies, Justus; Zollhöfer, Michael; Stamminger, Marc; Theobalt, Christian; Nießner, Matthias (June 2016). "Face2Face: Real-Time Face Capture and Reenactment of RGB Videos". 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). IEEE: 2387–2395. doi:10.1109/CVPR.2016.262. ISBN 9781467388511.
  9. ^ a b c Cole, Samantha (24 January 2018). "We Are Truly Fucked: Everyone Is Making AI-Generated Fake Porn Now". Vice. Retrieved 4 May 2019.
  10. ^ Haysom, Sam (31 January 2018). "People Are Using Face-Swapping Tech to Add Nicolas Cage to Random Movies and What Is 2018". Mashable. Retrieved 4 April 2019.
  11. ^ Cole, Samantha (11 December 2017). "AI-Assisted Fake Porn Is Here and We're All Fucked". Vice. Retrieved 19 December 2018.
  12. ^ a b Hathaway, Jay (8 February 2018). "Here's where 'deepfakes,' the new fake celebrity porn, went after the Reddit ban". The Daily Dot. Retrieved 22 December 2018.
  13. ^ "r/SFWdeepfakes". Reddit. Retrieved 12 December 2018.
  14. ^ Roettgers, Janko (2018-02-21). "Porn Producers Offer to Help Hollywood Take Down Deepfake Videos". Variety. Retrieved 2018-02-28.
  15. ^ "It took us less than 30 seconds to find banned 'deepfake' AI smut on the internet". Retrieved 2018-02-20.
  16. ^ Kharpal, Arjun (2018-02-08). "Reddit, Pornhub ban videos that use A.I. to superimpose a person's face over an X-rated actor". CNBC. Retrieved 2018-02-20.
  17. ^ "PornHub, Twitter Ban 'Deepfake' AI-Modified Porn". PCMAG. Retrieved 2018-02-20.
  18. ^ Cole, Samantha (2017-12-11). "AI-Assisted Fake Porn Is Here and We're All Fucked". Motherboard.
  19. ^ a b Markus Böhm (2018-02-07), "Deepfakes": Firmen gehen gegen gefälschte Promi-Pornos vor, Spiegel Online
  20. ^
  21. ^ "Call for upskirting bill to include 'deepfake' pornography ban". The Guardian.
  22. ^ Harrell, Drew. "Fake-porn videos are being weaponized to harass and humiliate women: 'Everybody is a potential target'". The Washington Post. Retrieved 2019-01-01.
  23. ^ a b c "Wenn Merkel plötzlich Trumps Gesicht trägt: die gefährliche Manipulation von Bildern und Videos". az Aargauer Zeitung. 2018-02-03.
  24. ^ Patrick Gensing. "Deepfakes: Auf dem Weg in eine alternative Realität?".
  25. ^ Romano, Aja (April 18, 2018). "Jordan Peele's simulated Obama PSA is a double-edged warning against fake news". Vox. Retrieved September 10, 2018.
  26. ^ Swenson, Kyle (January 11, 2019). "A Seattle TV station aired doctored footage of Trump's Oval Office speech. The employee has been fired". The Washington Post. Retrieved January 11, 2019.
  27. ^ Britta Bauchmüller, "Fake-App": Mit diesem Programm kann jeder im Porno landen – ob er will oder nicht!,
  28. ^ Eike Kühl (2018-01-26), Künstliche Intelligenz: Auf Fake News folgt Fake Porn, Die Zeit, ISSN 0044-2070
  29. ^ heise online, Deepfakes: Neuronale Netzwerke erschaffen Fake-Porn und Hitler-Parodien
  30. ^ Farquhar, Peter (2018-08-27). "An AI program will soon be here to help your deepfake dancing – just don't call it deepfake". Business Insider Australia. Retrieved 2018-08-27.
  31. ^ "Deepfakes for dancing: you can now use AI to fake those dance moves you always wanted". The Verge. Retrieved 2018-08-27.
  32. ^
  33. ^
  34. ^
  35. ^
  36. ^ Künstliche Intelligenz: Selfies sind eine gute Quelle, Die Zeit, 2018-01-26, ISSN 0044-2070
  37. ^ „Deepfake“ – FakeApp kann Personen in Pornos austauschen – Welche Rechte haben Geschädigte?, WILDE BEUGER SOLMECKE Rechtsanwälte, 2018-02-02
  38. ^ "Pornhub hasn't been actively enforcing its deepfake ban". Engadget. Retrieved 2018-04-21.
  39. ^ "Pornhub Banned Deepfake Celebrity Sex Videos, But The Site Is Still Full Of Them". BuzzFeed. Retrieved 2018-04-21.
  40. ^ barbara.wimmer, Deepfakes: Reddit löscht Forum für künstlich generierte Fake-Pornos
  41. ^ heise online. "Deepfakes: Auch Reddit verbannt Fake-Porn".
  42. ^ "Reddit verbannt Deepfake-Pornos".
  43. ^ Washington Post. "Fake-porn videos are being weaponized to harass and humiliate women: 'Everybody is a potential target'".
  44. ^ a b c d "Wodhams J. Picaper. Analog (Mid-December 1986)".

External links[edit]