Can AI Detect Deepfakes To Help Ensure Integrity of U.S. 2020 Elections?

IEEE Spectrum Robotics

A perfect storm arising from the world of pornography may threaten the 2020 U.S. elections with disruptive political scandals that have nothing to do with actual affairs. Instead, the face-swapping "deepfake" technology that first became popular on porn websites could eventually generate convincing fake videos of politicians saying or doing things that never happened, a scenario that could sow widespread chaos if such videos are not flagged and debunked in time. The thankless task of debunking fake images and videos online has generally fallen to news reporters, fact-checking websites, and some sharp-eyed good Samaritans. But the more recent rise of AI-driven deepfakes that can turn Hollywood celebrities and politicians into digital puppets may require additional fact-checking help from AI-driven detection technologies. An Amsterdam-based startup called Deeptrace Labs aims to become one of the go-to shops for such deepfake detection technology.
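The article does not describe how Deeptrace Labs' detectors actually work. As a purely illustrative sketch of one common approach to automated deepfake detection, the snippet below samples frames from a video, crops detected faces, and scores them with a binary convolutional classifier; the weights file, threshold, and model choice are hypothetical assumptions, not anything reported in the article.

```python
# Illustrative sketch only: a generic frame-level deepfake classifier.
# It does NOT reflect Deeptrace Labs' actual methods. The weights path
# ("deepfake_classifier.pt") and the 0.5 decision threshold are
# hypothetical placeholders; a real system would need a model trained
# on a labeled corpus of authentic and manipulated faces.
import cv2
import torch
import torch.nn as nn
from torchvision import models, transforms

# Standard ImageNet-style preprocessing applied to each face crop.
preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Resize((224, 224)),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def load_classifier(weights_path="deepfake_classifier.pt"):
    """ResNet-18 with a single-logit head: higher output = more likely fake."""
    model = models.resnet18(weights=None)
    model.fc = nn.Linear(model.fc.in_features, 1)
    model.load_state_dict(torch.load(weights_path, map_location="cpu"))
    model.eval()
    return model

def score_video(path, model, every_nth=10):
    """Average per-face 'fake' probability over sampled frames."""
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(path)
    scores, idx = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % every_nth == 0:
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            for (x, y, w, h) in detector.detectMultiScale(gray, 1.3, 5):
                face = cv2.cvtColor(frame[y:y+h, x:x+w], cv2.COLOR_BGR2RGB)
                with torch.no_grad():
                    logit = model(preprocess(face).unsqueeze(0))
                scores.append(torch.sigmoid(logit).item())
        idx += 1
    cap.release()
    return sum(scores) / len(scores) if scores else None

# Usage: flag a clip if the average fake probability exceeds 0.5.
# model = load_classifier()
# print(score_video("clip.mp4", model))
```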


Deepfakes, Revenge Porn, And The Impact On Women

#artificialintelligence

Imagine seeing yourself in a sexually explicit video in which you never participated. This is a distinct possibility today for a female celebrity, or any woman, living in the age of deepfakes. A deepfake is a technique for manipulating human images using artificial intelligence. What sets deepfake images and videos apart from other manipulated media is that they look strikingly authentic. Earlier this year, a deepfake video that went viral showed comedian Bill Hader's face being seamlessly transformed into that of Tom Cruise.


'A definite threat': The fake video phenomenon taking over the internet

#artificialintelligence

You might not be aware of it, but there's a quiet arms race going on over our collective reality. The fight is between those who want to subvert it, ushering in a world where we no longer believe what we see on our screens, and those who want to preserve the status quo. Until now, we have largely trusted our eyes and ears when consuming audio and visual media, but new technological systems that create so-called deepfakes are changing that. And as these deepfake videos nudge into the mainstream, experts are increasingly worried about the ramifications they will have for the information sharing that underpins society. Dr Richard Nock, head of machine learning at CSIRO's Data61, understands the daunting potential of the technology that powers deepfake videos.


The number of deepfake videos online is spiking. Most are porn

#artificialintelligence

San Francisco (CNN) -- Deepfake videos are quickly becoming a problem, but there has been much debate about just how big the problem really is. One company is now trying to put a number on it. There are at least 14,678 deepfake videos -- and counting -- on the internet, according to a recent tally by a startup that builds technology to spot this kind of AI-manipulated content. And nearly all of them are porn. That figure is 84% higher than last December, when Amsterdam-based Deeptrace found 7,964 deepfake videos during its first online count.


Fake-porn videos are being weaponized to harass and humiliate women: 'Everybody is a potential target'

Washington Post - Technology News

The video showed the woman in a pink off-the-shoulder top, sitting on a bed, smiling a convincing smile. But her face had been seamlessly grafted, without her knowledge or consent, onto someone else's body: a young pornography actress, just beginning to disrobe for the start of a graphic sex scene. A crowd of unknown users had been passing it around online. She felt nauseous and mortified: What if her co-workers saw it? Would it change how they thought of her? Would they believe it was a fake?