Deepfake videos can be fun, but not when it comes to politics and pornography. Now the state of California is doing something about it with two new bills signed into law last week by Governor Gavin Newsom. The first makes it illegal, within 60 days of an election, to post manipulated videos that could, for instance, replace a candidate's face or alter their speech in order to discredit them. The other allows residents of the state to sue anyone who puts their image into a pornographic video using deepfake technology. Deepfake videos have become more convincing as of late, especially recent clips from Ctrl Shift Face in which comedian and actor Bill Hader's face morphs into Tom Cruise's.
In November 2017, a Reddit account called deepfakes posted pornographic clips made with software that pasted the faces of Hollywood actresses over those of the real performers. Nearly two years later, deepfake is a generic noun for video manipulated or fabricated with artificial intelligence software. The technique has drawn laughs on YouTube, along with concern from lawmakers fearful of political disinformation. Yet a new report that tracked the deepfakes circulating online finds they mostly remain true to their salacious roots. Startup Deeptrace took a kind of deepfake census during June and July to inform its work on detection tools it hopes to sell to news organizations and online platforms.
Deepfakes, or face-swap videos, are videos or images that use machine learning to create or manipulate depictions of people and events. The best-known examples are celebrity face swaps so realistic that viewers struggle to tell them apart from genuine footage. Still a relatively new technology, deepfakes can produce highly convincing videos of people saying or doing things they never did, with potential uses ranging from realistic celebrity parodies to fabricated news. It is early in the technology's life, and many people worry about how it could be put to malicious use.
When several lifelike Tom Cruise deepfakes went viral on TikTok, many saw the future of truth through a glass, darkly, fearing a world where acquiring deepfakes of major celebrities or political figures would become a "one-click" feature of daily life. Like it or not, we live in a world where anyone can interact with deepfake technology. But producing high-end, specialized deepfakes, whether for mischief or for raising awareness, is harder than it looks. The creator of the videos, a Belgian VFX specialist named Chris Ume, thinks the one-click scenario is unlikely, emphasizing the impractically long timespans and substantial effort required to build every deepfake, in addition to finding an ace Tom Cruise impersonator (Miles Fisher). "You can't do it by just pressing a button," said Ume in a report from The Verge.