The number of deepfake videos online is spiking. Most are porn


San Francisco (CNN) Deepfake videos are quickly becoming a problem, but there has been much debate about just how big the problem really is. One company is now trying to put a number on it. There are at least 14,678 deepfake videos -- and counting -- on the internet, according to a recent tally by a startup that builds technology to spot this kind of AI-manipulated content. And nearly all of them are porn. That figure is 84% higher than last December, when Amsterdam-based Deeptrace found 7,964 deepfake videos during its first online count.
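As a quick sanity check, the 84% figure follows directly from the two counts Deeptrace reported (7,964 videos in December versus 14,678 in the recent tally); a minimal sketch:

```python
# Verify the growth figure cited above, using the two counts
# Deeptrace reported: 7,964 videos in December, 14,678 in the new tally.
december_count = 7_964
latest_count = 14_678

growth_pct = (latest_count - december_count) / december_count * 100
print(f"Growth: {growth_pct:.0f}%")  # prints "Growth: 84%"
```

The exact increase is about 84.3%, which rounds to the 84% reported in the article.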

Can AI Detect Deepfakes To Help Ensure Integrity of U.S. 2020 Elections?

IEEE Spectrum Robotics

A perfect storm arising from the world of pornography may threaten the U.S. elections in 2020 with disruptive political scandals having nothing to do with actual affairs. Instead, face-swapping "deepfake" technology that first became popular on porn websites could eventually generate convincing fake videos of politicians saying or doing things that never happened in real life--a scenario that could sow widespread chaos if such videos are not flagged and debunked in time. The thankless task of debunking fake images and videos online has generally fallen upon news reporters, fact-checking websites and some sharp-eyed good Samaritans. But the more recent rise of AI-driven deepfakes that can turn Hollywood celebrities and politicians into digital puppets may require additional fact-checking help from AI-driven detection technologies. An Amsterdam-based startup called Deeptrace Labs aims to become one of the go-to shops for such deepfake detection technologies.

Deepfakes, Revenge Porn, And The Impact On Women


Imagine seeing yourself in a sexually explicit video in which you never participated. This is a distinct possibility today for a female celebrity, or any ordinary woman, living in the age of deepfakes. A deepfake is a technique that uses artificial intelligence to manipulate images of people. What sets deepfake images and videos apart from other modified media is that they look strikingly authentic. Earlier this year, a deepfake video that went viral showed comedian Bill Hader's face being seamlessly transformed into that of Tom Cruise.

Even the AI Behind Deepfakes Can't Save Us From Being Duped


Last week Google released several thousand deepfake videos to help researchers build tools that use artificial intelligence to spot altered videos that could spawn political misinformation, corporate sabotage, or cyberbullying. Google's videos could be used to create technology that offers hope of catching deepfakes in much the way spam filters catch email spam. In reality, though, technology will only be part of the solution. That's because deepfakes will most likely improve faster than detection methods, and because human intelligence and expertise will be needed to identify deceptive videos for the foreseeable future. Deepfakes have captured the imagination of politicians, the media, and the public.

Most Deepfakes Are Porn, and They're Multiplying Fast


In November 2017, a Reddit account called deepfakes posted pornographic clips made with software that pasted the faces of Hollywood actresses over those of the real performers. Nearly two years later, deepfake has become a generic term for video manipulated or fabricated with artificial intelligence software. The technique has drawn laughs on YouTube, along with concern from lawmakers fearful of political disinformation. Yet a new report tracking the deepfakes circulating online finds they mostly remain true to their salacious roots. The startup Deeptrace took a kind of deepfake census during June and July to inform its work on detection tools it hopes to sell to news organizations and online platforms.