A perfect storm arising from the world of pornography may threaten the 2020 U.S. elections with disruptive political scandals that have nothing to do with actual affairs. Instead, the face-swapping "deepfake" technology that first became popular on porn websites could eventually generate convincing fake videos of politicians saying or doing things that never happened in real life, a scenario that could sow widespread chaos if such videos are not flagged and debunked in time. The thankless task of debunking fake images and videos online has generally fallen to news reporters, fact-checking websites and sharp-eyed good Samaritans. But the more recent rise of AI-driven deepfakes that can turn Hollywood celebrities and politicians into digital puppets may require additional fact-checking help from AI-driven detection technologies. An Amsterdam-based startup called Deeptrace Labs aims to become one of the go-to shops for such deepfake detection technology.
"You can already see a material effect that deepfakes have had," said Nick Dufour, one of the Google engineers overseeing the company's deepfake research. "They have allowed people to claim that video evidence that would otherwise be very convincing is a fake." For decades, computer software has allowed people to manipulate photos and videos or create fake images from scratch. But it has been a slow, painstaking process usually reserved for experts trained in the vagaries of software like Adobe Photoshop or After Effects. Now, artificial intelligence technologies are streamlining the process, reducing the cost, time and skill needed to doctor digital images.
In a world where your online identity is tied directly to you, the prospect of someone perfectly replicating your likeness is worrying. But that is exactly what we face with the advent of deepfake technology. As the technology becomes cheaper and easier to use, what are the dangers of deepfakes? And how can you tell a deepfake from the real thing? A deepfake is a piece of media, video or image, in which a person is replaced with someone else's likeness.
The social media goliath has created a model that can tell whether a video is a deepfake. Not only that, it can also identify the algorithm that was used to generate the video. The idea sounds promising, but will it be enough to keep deepfakes at bay? Before we delve into the details, let's first understand what deepfakes are. Have you ever come across a video in which a highly influential person says something completely out of character? If so, you may already have seen a deepfake video.
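To make the idea of attributing a fake to its generator concrete, here is a minimal, illustrative sketch, not Facebook's actual system. It rests on one assumption stated in the comments: that each generator leaves a consistent noise "fingerprint" in its outputs, which a detector can correlate against. The generator names (gan_a, gan_b), the fingerprint vectors, and the threshold are all invented for this toy example; real detectors learn such patterns from large datasets with deep networks.

```python
import random

random.seed(0)
DIM = 256

# Toy "fingerprints": this sketch assumes each generator leaves a
# consistent noise pattern in everything it produces. In practice such
# patterns are learned from data, not hard-coded.
FINGERPRINTS = {
    "gan_a": [random.gauss(0, 1) for _ in range(DIM)],
    "gan_b": [random.gauss(0, 1) for _ in range(DIM)],
}

def dot(u, v):
    # Inner product, used here as a simple correlation score.
    return sum(x * y for x, y in zip(u, v))

def make_fake(content, generator):
    # A fake frame = underlying content + the generator's noise pattern.
    fp = FINGERPRINTS[generator]
    return [c + f for c, f in zip(content, fp)]

def attribute(frame, threshold=60.0):
    # Correlate the frame against each known fingerprint; the best
    # match above the threshold names the suspected generator.
    # Nothing above the threshold means "no known generator found".
    scores = {name: dot(frame, fp) for name, fp in FINGERPRINTS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > threshold else "real"

# A "frame" of plain content, and a fake built from it:
content = [random.gauss(0, 1) for _ in range(DIM)]
fake = make_fake(content, "gan_a")
```

Calling attribute(fake) scores the frame against every known fingerprint, so the same pass that flags a video as fake can also point to which algorithm likely made it, which is the key idea behind attribution.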
We've acquired Dessa, a Toronto-based company building machine learning applications that address significant real-world challenges for all types of businesses. Their team of world-class engineers will immediately bolster our machine learning and artificial intelligence capabilities at Square. Machine learning is a critical field in technology today, and we've expanded our machine learning work at Square over time through both in-house development and acquisitions like Eloquent Labs. The acquisition of Dessa will help us further boost our machine learning abilities, improve our products, and ultimately pass on the benefits to our customers around the world. For example, machine learning technology can help us enhance products in areas like customer engagement, risk management, and more.