Deepfakes and the 2020 US elections: what (did not) happen

arXiv.org Artificial Intelligence

In retrospect, Nisos experts made the right forecast. However, this was a clear minority opinion. Before and after their report, dozens of politicians and institutions drew considerable attention to the approaching danger: 'imagine a scenario where, on the eve of next year's presidential election, the Democratic nominee appears in a video where he or she endorses President Trump. Now, imagine it the other way around.' (Sprangler, 2019). It is fair to say that deepfakes' high potential for disinformation was noticed long before these hypothetical consequences were evoked, mainly because they had been shown to be highly credible. Two examples: 'In an online quiz, 49 percent of people who visited our site said they incorrectly believed Nixon's synthetically altered face was real and 65 percent thought his voice was real' (Panetta et al., 2020), or 'Two-thirds of participants believed that one day it would be impossible to discern a real video from a fake one.'


Microsoft unveils new tools to identify deepfake videos

Daily Mail - Science & tech

Microsoft has launched a new tool to identify 'deepfake' photos and videos that have been created to trick people into believing false information online. Deepfakes – also known as synthetic media – are photos, videos or audio files that have been manipulated using AI to show or say something that isn't real. There were at least 96 'foreign influenced' deepfake campaigns on social media targeting people in 30 countries between 2013 and 2019, according to Microsoft. To combat campaigns using this manipulated form of media, the tech giant has launched a new 'Video Authenticator' tool that can analyse a still photo or video and provide a percentage chance that the media source has been manipulated. It works by detecting the blending boundary of the deepfake, and subtle fading or greyscale elements that might not be detectable by the human eye.
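
The article only sketches how such detection works. As a rough illustration (not Microsoft's Video Authenticator, whose implementation is not described beyond the quote above), the hypothetical snippet below scores a still image for excess edge energy along the border of a detected face, the kind of blending-seam artifact the tool is said to look for. It assumes OpenCV and NumPy are installed; the function name boundary_artifact_score and the scoring heuristic are invented for this example.

    # Illustrative heuristic only; NOT Microsoft's Video Authenticator.
    # Assumes OpenCV (cv2) and NumPy are available.
    import cv2
    import numpy as np

    def boundary_artifact_score(image_path: str) -> float:
        """Rough 0-1 score comparing edge energy in a thin band around a
        detected face against the face interior; higher values suggest a
        visible blending seam, a common face-swap artifact."""
        img = cv2.imread(image_path)
        if img is None:
            raise FileNotFoundError(image_path)
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

        # Detect the largest frontal face with OpenCV's bundled Haar cascade.
        cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(faces) == 0:
            return 0.0
        x, y, w, h = max(faces, key=lambda f: f[2] * f[3])

        # High-frequency energy via the Laplacian; blending seams add extra
        # edge energy along the face outline.
        edges = np.abs(cv2.Laplacian(gray.astype(np.float32), cv2.CV_32F))

        band = max(2, w // 20)  # thin band around the face box
        outer = edges[max(0, y - band):y + h + band, max(0, x - band):x + w + band]
        inner = edges[y + band:y + h - band, x + band:x + w - band]
        if inner.size == 0 or outer.size == 0:
            return 0.0

        # Mean energy on the band (outer region minus interior) vs. interior.
        border_energy = (outer.sum() - inner.sum()) / max(1, outer.size - inner.size)
        interior_energy = inner.mean() + 1e-6
        ratio = border_energy / interior_energy
        return float(np.clip((ratio - 1.0) / 4.0, 0.0, 1.0))  # squash to [0, 1]

A production detector would be trained on large datasets of real and synthesized faces rather than relying on a hand-tuned ratio; the sketch is only meant to make the 'blending boundary' idea concrete.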


Where Are The Deepfakes In This Presidential Election?

NPR Technology

So far, few deepfakes have been used this political season. It's not because they aren't a potential threat, but because simpler deceptive tactics are still effective at spreading misinformation. Despite people's fears, sophisticated, deceptive videos known as "deepfakes" haven't arrived this political season.


Artificial Intelligence to Weaponize Fake Videos

#artificialintelligence

Deception operations using high-quality fake videos produced with artificial intelligence are the next phase of information warfare operations by nation states aimed at subverting American democracy. Currently, "deepfakes," or human image-synthesized videos, mainly involve celebrity likenesses and voices superimposed onto pornographic videos. But the weaponization of deepfakes for political smear campaigns, in commercial operations to discredit businesses, or subversion by foreign intelligence services in disinformation operations is a looming threat. "I believe this is the next wave of attacks against America and Western democracies," said Sen. Marco Rubio (R., Fla.), a member of the Senate Select Committee on Intelligence. Rubio is pushing the U.S. intelligence community to address the danger of deepfake disinformation campaigns from nation states or terrorists before the threat fully emerges.


Deepfakes may not have upended the 2020 U.S. election, but their day is coming

#artificialintelligence

Many projected that deepfake videos would play a lead role in the 2020 elections, with the prospect of foreign interference and disinformation campaigns looming large in the leadup to election day. Yet, if there has been a surprise in campaign tactics this cycle, it is that these AI-generated videos have played a very minor role, little more than a cameo (so far, at least). Deepfake videos are much more convincing today due to giant leaps in the field of Generative Adversarial Networks. Deepfakes are synthetic videos doctored to alter reality, showing events or depicting speech that never happened. Because people tend to lend substantial credence to what they see and hear, deepfakes pose a very real danger.