Deepfakes, videos in which a performer’s face is swapped with someone else’s (usually a celebrity’s), are mostly used to create computer-generated fake celebrity porn. But they’re terrifying because the same technology can produce convincing fake videos of celebrities and politicians doing things they’ve never actually done. Thankfully, Twitter and Reddit have officially banned deepfakes from their platforms, but a deepfakes ban assumes that websites will always be able to detect such videos and stop them before they have real-world consequences.
In Twitter’s statement about its deepfakes ban, the company said posting such videos violates a policy that states, “You may not post or share intimate photos or videos of someone that were produced or distributed without their consent.”
A Twitter spokesperson added, “We will suspend any account we identify as the original poster of intimate media that has been produced or distributed without the subject’s consent. We will also suspend any account dedicated to posting this type of content.”
The news site Mashable noted that the online chat service Discord, the GIF site Gfycat and the porn tube site Pornhub have also agreed to remove any deepfakes discovered on their sites.
Similarly, the link-sharing and discussion board site Reddit recently banned several subreddits dedicated to deepfakes, including r/deepfaker, r/deepfakes, r/CelebFakes and r/YouTubeFakes, citing a newly developed policy against “involuntary pornography.”
However, Vice reports that despite Reddit’s deepfakes ban, several discussion boards dedicated to non-porn deepfakes, including r/FakeApp, r/SFWdeepfakes and r/videofakes, remain up.
Furthermore, many other websites and platforms will happily host deepfakes, which raises two questions: First, as deepfake technology improves, how will social media platforms tell deepfake porn videos from real celebrity sex tapes? Second, when will these sites take similar action against non-porn deepfake videos (and GIFs) depicting public figures doing and saying things they never actually did?
Right now, fake news sites dodge bans from social media platforms by discreetly labeling themselves as “satire” or by citing unnamed “sources” in their stories so as to avoid ever being proven deliberately misleading.
But while a video of Trump’s face swapped onto Hillary Clinton’s head (above) may seem hilarious and mostly harmless, deepfake videos like this one of Barack Obama discussing the Pulse nightclub shooting show how easily the technology can create convincing fakes that could go viral before they’re ever detected.