The Cyberspace Administration of China (CAC) announced on Friday that it is banning the use of deepfake video and audio to create fake news, according to Reuters. "Deepfakes" are video or audio content that has been manipulated using AI to make it appear that someone said or did something they never did.

In its statement, the CAC said: "With the adoption of new technologies, such as deepfake, in online video and audio industries, there have been risks in using such content to disrupt social order and violate people's interests, creating political risks and bringing a negative impact to national security and social stability," according to the South China Morning Post's reporting on the new rules.

The regulations, which take effect on January 1, 2020, require publishers of deepfake content to disclose that a piece of content is, indeed, a deepfake. They also require content providers to detect deepfake content themselves, the South China Morning Post reported.
A deepfake is media in which a person in a video or image is replaced with someone else's likeness. In a world where your online identity is tied directly to you, the prospect of perfect replication is worrying, and that is exactly what deepfake technology makes possible. As the technology becomes cheaper and easier to use, what are the dangers of deepfakes? And how can you tell a deepfake from the real deal?
In early 2018, a video that appeared to feature former President Obama discussing the dangers of fake news went viral. The clip, created by comedian Jordan Peele, foreshadowed challenges that have since become all too real. These days, tech firms, media companies and consumers are routinely forced to judge whether content is authentic or fake, and it is increasingly hard to tell the difference. Deepfakes are videos and images that have been digitally manipulated to depict people saying and doing things that never happened. Most deepfakes use artificial intelligence to alter video and to generate authentic-sounding audio.
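To make the "digitally manipulated" part concrete, a common face-swap approach trains one shared encoder with a separate decoder per identity: the encoder learns pose and expression features common to both faces, and each decoder learns to render those features as one particular face. Swapping then means encoding person B's frame and decoding it with person A's decoder. The sketch below is a deliberately tiny, hypothetical numpy illustration of that architecture using linear layers and random data in place of real face crops; production systems use deep convolutional networks trained on thousands of frames.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for aligned face crops: flattened 8x8 patches,
# one batch per identity (A and B).
faces_a = rng.random((32, 64))
faces_b = rng.random((32, 64))

# One shared encoder; one decoder per identity.
enc = rng.normal(scale=0.1, size=(64, 16))
dec_a = rng.normal(scale=0.1, size=(16, 64))
dec_b = rng.normal(scale=0.1, size=(16, 64))

def mse(x, dec):
    # Mean-squared reconstruction error for one identity.
    return float(((x @ enc @ dec - x) ** 2).mean())

def step(x, dec, lr=0.02):
    """One gradient-descent step on reconstruction error for one identity."""
    global enc
    h = x @ enc                           # shared latent code (pose/expression)
    err = h @ dec - x                     # reconstruction residual
    g_dec = h.T @ err / len(x)            # gradient w.r.t. this identity's decoder
    g_enc = x.T @ (err @ dec.T) / len(x)  # gradient w.r.t. the shared encoder
    dec -= lr * g_dec
    enc -= lr * g_enc

before = mse(faces_a, dec_a) + mse(faces_b, dec_b)
for _ in range(300):
    step(faces_a, dec_a)  # both identities train the same encoder...
    step(faces_b, dec_b)  # ...but each trains only its own decoder
after = mse(faces_a, dec_a) + mse(faces_b, dec_b)

# The "swap": encode B's frames, decode with A's decoder.
# Conceptually this yields A's face performing B's expressions.
fake = faces_b @ enc @ dec_a
```

The key design choice is the shared encoder: because both identities are forced through the same latent representation, that representation ends up carrying identity-agnostic information (head pose, expression), which is exactly what makes the decoder swap produce a convincing face transfer.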