A perfect storm arising from the world of pornography may threaten the U.S. elections in 2020 with disruptive political scandals having nothing to do with actual affairs. Instead, face-swapping "deepfake" technology that first became popular on porn websites could eventually generate convincing fake videos of politicians saying or doing things that never happened in real life, a scenario that could sow widespread chaos if such videos are not flagged and debunked in time. The thankless task of debunking fake images and videos online has generally fallen to news reporters, fact-checking websites, and sharp-eyed good Samaritans. But the more recent rise of AI-driven deepfakes, which can turn Hollywood celebrities and politicians into digital puppets, may require additional fact-checking help from AI-driven detection technologies. An Amsterdam-based startup called Deeptrace Labs aims to become one of the go-to shops for such deepfake detection technology.
In a world where your online identity is tied directly to you, the prospect of perfect replication is worrying. But that is exactly what we face with the advent of deepfake technology. As the technology becomes cheaper and easier to use, what are the dangers of deepfakes? And how can you tell a deepfake from the real deal? A deepfake is media in which a person in a video or image is replaced with someone else's likeness.
"You can already see a material effect that deepfakes have had," said Nick Dufour, one of the Google engineers overseeing the company's deepfake research. "They have allowed people to claim that video evidence that would otherwise be very convincing is a fake." For decades, computer software has allowed people to manipulate photos and videos or create fake images from scratch. But it has been a slow, painstaking process usually reserved for experts trained in the vagaries of software like Adobe Photoshop or After Effects. Now, artificial intelligence technologies are streamlining the process, reducing the cost, time and skill needed to doctor digital images.
We've acquired Dessa, a Toronto-based company building machine learning applications that address significant real-world challenges for all types of businesses. Their team of world-class engineers will immediately bolster our machine learning and artificial intelligence capabilities at Square. Machine learning is a critical field in technology today, and we've expanded our machine learning work at Square over time through both in-house development and acquisitions like Eloquent Labs. The acquisition of Dessa will help us further boost our machine learning abilities, improve our products, and ultimately pass on the benefits to our customers around the world. For example, machine learning technology can help us enhance products in areas like customer engagement, risk management, and more.
As coverage of deepfake technology becomes more prevalent, it's reasonable to wonder how these videos even work. Advancements in motion capture and facial recognition over the past decade have been staggering, and terrifying. What used to be limited to the most well-funded computer scientists and movie studios is now a tool in the hands of comedy outlets and state-run media. By definition, deepfakes are videos in which a person's face and/or voice are replaced with someone else's by an AI. The underlying machine learning techniques grew out of academic research dating back to the early 1990s.
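To make the face-replacement idea concrete, a common deepfake design (an assumption here, not something detailed in the articles above) trains one shared encoder with two person-specific decoders: the encoder learns pose and expression, and each decoder learns to render one person's face. Swapping is done by decoding person A's encoded frame with person B's decoder. The sketch below uses random, untrained NumPy layers purely to illustrate the data flow, not a working face swapper:

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(in_dim, out_dim):
    """A random affine map plus tanh, standing in for a trained network."""
    W = rng.standard_normal((in_dim, out_dim)) * 0.1
    b = np.zeros(out_dim)
    return lambda x: np.tanh(x @ W + b)

FACE, LATENT = 64 * 64, 128            # flattened 64x64 face; latent code size

encoder   = layer(FACE, LATENT)        # shared: captures pose and expression
decoder_a = layer(LATENT, FACE)        # trained (hypothetically) on person A
decoder_b = layer(LATENT, FACE)        # trained (hypothetically) on person B

# Training would pair encoder+decoder_a on A's frames and encoder+decoder_b
# on B's frames. The swap happens at inference time:
face_a  = rng.random(FACE)             # random stand-in for a frame of A
swapped = decoder_b(encoder(face_a))   # A's expression rendered as B's face
print(swapped.shape)
```

Because the encoder is shared, the latent code carries only what both decoders need (head pose, expression, lighting), which is why routing it through the other decoder produces a plausible swap once the networks are actually trained.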