The US military is funding an effort to catch deepfakes and other AI trickery

MIT Technology Review 

Think that AI will help put a stop to fake news? The Department of Defense is funding a project that will try to determine whether the increasingly real-looking fake video and audio generated by artificial intelligence might soon be impossible to distinguish from the real thing--even for another AI system.

This summer, under a project funded by the Defense Advanced Research Projects Agency (DARPA), the world's leading digital forensics experts will gather for an AI fakery contest. They will compete to generate the most convincing AI-generated fake video, imagery, and audio--and they will also try to develop tools that can catch these counterfeits automatically. The contest will include so-called "deepfakes," videos in which one person's face is stitched onto another person's body. Rather predictably, the technology has already been used to generate a number of counterfeit celebrity porn videos.