Deepfake videos can be fun, but not when it comes to politics and pornography. Now, the state of California is doing something about it with two new bills signed into law last week by Governor Gavin Newsom. The first makes it illegal to post, within 60 days of an election, any manipulated video that could, for instance, replace a candidate's face or speech in order to discredit them. The other will allow residents of the state to sue anyone who puts their image into a pornographic video using deepfake technology. Deepfake videos have become more convincing as of late, especially recent ones from Ctrl Shift Face that show comedian/actor Bill Hader's face replaced by Tom Cruise's.
Remember how Pornhub said it was banning AI-generated deepfake videos? BuzzFeed News "easily" found over 100 of the non-consensual videos by searching for obvious keywords like "deepfake" and "fake deep," nearly all of them explicitly mentioning their deepfake status in the title or the uploader's username. The clips had been up for a while, too: many had hundreds of thousands or even millions of views, and some surfaced in home page recommendations. The site removed some of the videos after BuzzFeed got in touch, but others (again, with hundreds of thousands of views) were still turning up under other keywords.
Deepfake videos are hard for untrained eyes to detect because they can be quite realistic. Whether used as personal weapons of revenge, to manipulate financial markets or to destabilize international relations, videos depicting people doing and saying things they never did or said are a fundamental threat to the longstanding idea that "seeing is believing." Most deepfakes are made by showing a computer algorithm many images of a person, then having it use what it saw to generate new face images. At the same time, the person's voice is synthesized, so it both looks and sounds like they have said something new. Some of my research group's earlier work allowed us to detect deepfake videos that did not include a person's normal amount of eye blinking, but the latest generation of deepfakes has adapted, so our research has continued to advance.
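The blink-based detection idea can be sketched in a few lines. This is a simplified illustration, not the research group's actual method: it assumes an upstream face-landmark detector has already produced a per-frame "eye openness" score (such as the eye aspect ratio used in blink-detection work), and the threshold and blink-rate figures below are illustrative assumptions.

```python
# Hypothetical sketch: flag a video whose subject blinks far less
# often than a real person would. Assumes `ear_series` is a list of
# per-frame eye-openness scores from some landmark detector, where
# values below `closed_thresh` mean the eyes are closed.

def count_blinks(ear_series, closed_thresh=0.2):
    """Count blinks as open-to-closed transitions in the score series."""
    blinks = 0
    eyes_closed = False
    for ear in ear_series:
        if ear < closed_thresh and not eyes_closed:
            blinks += 1          # eyes just closed: one new blink
            eyes_closed = True
        elif ear >= closed_thresh:
            eyes_closed = False  # eyes reopened; ready for the next blink
    return blinks

def looks_like_deepfake(ear_series, fps=30, min_blinks_per_min=5):
    """People blink roughly 15-20 times a minute; far fewer is suspicious."""
    minutes = len(ear_series) / (fps * 60)
    if minutes == 0:
        return False
    return count_blinks(ear_series) / minutes < min_blinks_per_min
```

As the article notes, newer deepfakes have learned to blink, so a real detector would need to combine this cue with others rather than rely on it alone.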
The furor around deepfakes, porn videos that use machine learning to convincingly edit celebrities into sex scenes, has largely died down since many hosting sites banned the clips months ago. But deepfakes are still out there, even on sites where they're not technically allowed. Popular streaming site PornHub, which classifies deepfakes as nonconsensual and theoretically doesn't permit them, still hosts dozens of the videos. BuzzFeed's Charlie Warzel wrote on Wednesday that he'd found more than 100 deepfake videos on PornHub, and they weren't particularly well-hidden. Searches like "deepfake" and "fake deeps" brought up dozens of clips.
For weeks, computer scientist Siwei Lyu had watched his team's deepfake videos with a gnawing sense of unease. Created by a machine learning algorithm, these falsified films showed celebrities doing things they'd never done. They felt eerie to him, and not just because he knew they'd been ginned up. "They don't look right," he recalls thinking, "but it's very hard to pinpoint where that feeling comes from." Then it struck him: the faces on screen rarely blinked. He, like many kids, had held staring contests with his open-eyed peers.