Deepfakes, or face-swap videos, are videos or images that use machine learning to create and manipulate convincing visuals of people or events. The best-known examples are celebrity deepfake videos so realistic that viewers can't tell them apart from the real thing. Though the technology is still relatively new, it can already produce highly convincing footage of people saying or doing things they never did. This has many potential uses, from realistic celebrity videos to fake news, but many people worry about how it could be abused.
The furor around deepfakes, porn videos that use machine learning to convincingly edit celebrities into sex scenes, has largely died down since many hosting sites banned the clips months ago. But deepfakes are still out there, even on sites where they're not technically allowed. Popular streaming site PornHub, which classifies deepfakes as nonconsensual and theoretically doesn't permit them, still hosts dozens of the videos. BuzzFeed's Charlie Warzel wrote on Wednesday that he'd found more than 100 deepfake videos on PornHub, and they weren't particularly well-hidden. Searches like "deepfake" and "fake deeps" brought up dozens of clips.
The social media giant is putting $10 million into the "Deepfake Detection Challenge," which aims to spur detection research. As part of the project, Facebook is commissioning researchers to produce realistic deepfakes to create a data set for testing detection tools. The company said the videos, which will be released in December, will feature paid actors and will use no user data. In the run-up to the U.S. presidential election in November 2020, social platforms have been under pressure to tackle the threat of deepfakes, which use artificial intelligence to create hyper-realistic videos in which a person appears to say or do something they did not. While no well-crafted deepfake video has yet had major political consequences in the United States, the potential for manipulated video to cause turmoil was recently demonstrated by a "cheapfake" clip of House Speaker Nancy Pelosi, manually slowed down to make her speech seem slurred.
Deepfake videos can be fun, but not when it comes to politics and pornography. Now, the state of California is doing something about it with two new bills signed into law last week by Governor Gavin Newsom. The first makes it illegal to post, within 60 days of an election, any manipulated video that could, for instance, replace a candidate's face or speech in order to discredit them. The other will allow residents of the state to sue anyone who puts their image into a pornographic video using deepfake technology. Deepfake videos have become more convincing lately, as shown by recent clips from Ctrl Shift Face in which comedian and actor Bill Hader's face morphs into Tom Cruise's.
When several life-like Tom Cruise deepfakes went viral on TikTok, many saw the future of truth through a glass, darkly -- out of concern for a world where acquiring deepfakes of major celebrities or political figures would become a "one-click" feature of daily life. Like it or not, we live in a world where anyone can interact with deepfake technology. But producing high-end, specialized deepfakes -- whether for mischief or to raise awareness -- is harder than it looks. The creator of the videos, a Belgian VFX specialist named Chris Ume, thinks this scenario is unlikely, emphasizing the impractically long timespans and substantial effort required to build each deepfake, in addition to finding an ace Tom Cruise impersonator (Miles Fisher). "You can't do it by just pressing a button," said Ume in a report from The Verge.