The Bizarre and Terrifying Case of the "Deepfake" Video that Helped Bring an African Nation to the Brink

Mother Jones

This fall, Gabon was facing an odd and tenuous political situation. President Ali Bongo had been out of the country since October receiving medical treatment in Saudi Arabia and London and had not been seen in public. People in Gabon and observers outside the country were growing suspicious about the president's well-being, and the government's lack of answers only fueled doubts; some even said he was dead. After months of little information, on December 9th, the country's vice president announced that Bongo had suffered a stroke in the autumn but remained in good shape. Despite such assurances, civil society groups and many members of the public wondered why Bongo, if he was well, had not made any public appearances, save for a few pictures of him released by the government along with a silent video.

Can AI Detect Deepfakes To Help Ensure Integrity of U.S. 2020 Elections?

IEEE Spectrum Robotics

A perfect storm arising from the world of pornography may threaten the U.S. elections in 2020 with disruptive political scandals having nothing to do with actual affairs. Instead, face-swapping "deepfake" technology that first became popular on porn websites could eventually generate convincing fake videos of politicians saying or doing things that never happened in real life, a scenario that could sow widespread chaos if such videos are not flagged and debunked in time. The thankless task of debunking fake images and videos online has generally fallen upon news reporters, fact-checking websites, and some sharp-eyed Good Samaritans. But the more recent rise of AI-driven deepfakes that can turn Hollywood celebrities and politicians into digital puppets may require additional fact-checking help from AI-driven detection technologies. An Amsterdam-based startup called Deeptrace Labs aims to become one of the go-to shops for such deepfake detection technologies.

Deepfakes, Revenge Porn, And The Impact On Women

Imagine seeing yourself in a sexually explicit video in which you have never participated. This is a distinct possibility today for a female celebrity, or any woman, living in the age of deepfakes. A deepfake is an artificial-intelligence-based technique for manipulating human images. What sets deepfake images and videos apart from other modified images is that they look strikingly authentic. Earlier this year, a deepfake video that went viral showed comedian Bill Hader's face being seamlessly transformed into that of Tom Cruise.

What are deepfakes? AI that deceives

The original example of a deepfake (by Reddit user /u/deepfakes) swapped the face of an actress onto the body of a porn performer in a video – which was, of course, completely unethical, although not initially illegal. Other deepfakes have changed what famous people were saying, or the language they were speaking. Deepfakes extend the idea of video (or movie) compositing, which has been done for decades. Significant skill, time, and equipment go into video compositing; video deepfakes require much less of each (assuming you have GPUs), although they are often unconvincing to careful observers. Originally, deepfakes relied on autoencoders, a type of unsupervised neural network, and many still do.
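The face-swap trick behind early deepfakes used one shared encoder with a separate decoder per identity: train both reconstruction paths, then encode a face of person A and decode it with person B's decoder. Below is a minimal toy sketch of that architecture using only NumPy and linear layers; the dimensions and synthetic "face" data are invented for illustration, and a real system would use deep convolutional networks on actual images.

```python
import numpy as np

# Toy illustration of the shared-encoder / two-decoder autoencoder
# idea behind early deepfakes. Data and sizes are made up.
rng = np.random.default_rng(0)
dim, latent = 16, 4                      # "image" size and bottleneck size

# Synthetic stand-ins for face images of person A and person B.
faces_a = rng.normal(0.0, 1.0, (200, dim))
faces_b = rng.normal(2.0, 1.0, (200, dim))

# One shared encoder; one decoder per identity (linear, for simplicity).
W_enc = rng.normal(0, 0.1, (dim, latent))
W_dec_a = rng.normal(0, 0.1, (latent, dim))
W_dec_b = rng.normal(0, 0.1, (latent, dim))

lr = 0.005
for _ in range(500):
    for faces, W_dec in ((faces_a, W_dec_a), (faces_b, W_dec_b)):
        z = faces @ W_enc                # shared encoding
        err = z @ W_dec - faces          # reconstruction error
        # Gradient descent on mean squared reconstruction error.
        g_dec = z.T @ err / len(faces)
        g_enc = faces.T @ (err @ W_dec.T) / len(faces)
        W_dec -= lr * g_dec
        W_enc -= lr * g_enc

# The "deepfake" step: encode a face of A, decode with B's decoder,
# yielding a fake image that keeps A's pose but takes on B's appearance.
swap = faces_a[:1] @ W_enc @ W_dec_b
print(swap.shape)  # → (1, 16)
```

Because the encoder is trained on both identities, its latent code captures pose and expression common to both, while each decoder learns one person's appearance; that separation is what makes the swap work.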

What are deepfakes – and how can you spot them?

Have you seen Barack Obama call Donald Trump a "complete dipshit", or Mark Zuckerberg brag about having "total control of billions of people's stolen data", or witnessed Jon Snow's moving apology for the dismal ending to Game of Thrones? Answer yes and you've seen a deepfake. The 21st century's answer to Photoshopping, deepfakes use a form of artificial intelligence called deep learning to make images of fake events, hence the name deepfake. Want to put new words in a politician's mouth, star in your favourite movie, or dance like a pro? Then it's time to make a deepfake.