Reddit on Wednesday banned “deepfake” pornography — fake celebrity porn videos created with face-swapping technologies — and shut down the community of the same name, becoming the third major Internet platform this week to crack down on the increasingly popular clips.
The site also updated its rules to prohibit sexually explicit photos and images “that have been faked.”
The move comes just a day after Pornhub prohibited deepfakes, saying the videos amount to nonconsensual content or revenge porn that violates the porn mega-site’s terms of service. Twitter followed suit on Tuesday, vowing to suspend any accounts posting such content. The gaming site Discord and the GIF creator Gfycat banned deepfakes in January.
It doesn’t take a sophisticated production studio to produce face-swap porn, which is a major reason deepfakes have proliferated online in recent months.
Deepfake makers create their videos using a patchwork of readily available technologies, as Vice’s Motherboard has explained in detail. Often, they use open-source social media tools to download photos of victims en masse. Once they have enough photos of the targeted celebrity’s face to work with — it typically takes a few hundred — they look for a suitable porn performer’s body to graft them onto. In this sense, performers are victimized, too.
Some deepfake makers have experimented with browser-based applications that supposedly use facial recognition software to find the best face-body matches, according to Motherboard. When the data is fed to a machine learning algorithm, freely available online, the resulting fake porn videos can be disturbingly convincing.
The practice of creating deepfakes was popularized last year by an anonymous Reddit user called deepfakes who posted fake hardcore porn videos featuring the faces of Gal Gadot, Taylor Swift and other celebrities. The account garnered some 80,000 subscribers in a matter of months.
In late January, Motherboard reported that another Reddit user named deepfakeapp had created FakeApp, an application that put deepfake technology into a user-friendly package, allowing people who lack the technical chops to create face-swap porn with relative ease. User-generated videos of celebrities quickly began to crop up on the deepfakes subreddit, leading to this week’s spate of bans.
On Wednesday, deepfakeapp posted that he had zero tolerance for nonconsensual pornography and threatened to ban from the FakeApp subreddit anyone who posted such material.
“FakeApp was created to give everyday people access to realistic faceswapping technology for creative and exploratory use,” the post read. “It was certainly not created to enable the generation of nonconsensual pornography, and we have never condoned its use for that purpose.”
Deepfakeapp also posted a tutorial on how to make face-swap GIFs, noting, “If you use this tool, please use it responsibly.”
The vast majority of deepfakes available online feature celebrities. But the implications for private individuals are the same. In a few clicks, all the photos in a person’s Instagram account could be downloaded, fed to the right software and turned into fake porn — or something non-pornographic but equally embarrassing or incriminating.
To be sure, there are harmless uses for this. Similar technology was used to recreate a young Princess Leia in “Rogue One.” And of course there’s the popular albeit controversial suite of face-modifying apps for Snapchat.
But as Wired noted recently, U.S. law offers little recourse to victims of face-swap porn, largely because courts thus far have viewed such material in the same realm as parody or satire, which enjoy strong First Amendment protections.
“It falls through the cracks because it’s all very betwixt and between,” Danielle Citron, a law professor at the University of Maryland and cyberspace expert, told Wired. “There are all sorts of First Amendment problems because it’s not their real body.”