When you're in the metaverse, you are generally represented by either a blocky, cartoonish avatar or a disembodied floating torso with a pair of hands, neither of which looks remotely like you. But what happens when things become much more real? A number of companies are developing ways for you to create hyper-realistic representations of yourself for the metaverse, complete with your face, your voice, and even the way you move. One of these is Metaphysic, a deepfake (or synthetic media) company founded by Chris Ume, creator of the Deep Tom Cruise videos that took TikTok by storm last year.
On TikTok, you might have seen Tom Cruise playing acoustic guitar in a plain white t-shirt and a green baseball cap. You might have seen Tom Cruise check himself out shirtless in a bathroom mirror. All of these Tom Cruise appearances were deepfakes: computer-generated videos that transplant a person's face, voice, and overall likeness onto another body (in this case, that of actor Miles Fisher). Almost everything about deepfakes is controversial. The term, a mishmash of "deep learning" and "fake," originates from a 2017 Reddit community that superimposed celebrities' faces onto pornographic videos, sparking an ethical row around the technology.
Metaphysic describes itself this way: "Our team is made up of some of the world's leading AI artists and synthetic media creators. We love to create fun content and aim to delight audiences around the world with digital experiences that offer a glimpse of a hyperreal future. We are our own best customers and beta testers for our products. But we also believe it is imperative to create content that educates viewers and raises awareness about the underlying technologies, in order to counter the negative impact of unethical uses of synthetic media. Earlier this year, we released @deeptomcruise, a series of parody videos on TikTok that quickly garnered more than 100 million views and coverage from hundreds of media outlets around the world."
Those videos showed the actor doing some surprisingly un-Tom-Cruise-like stuff: goofing around in an upscale men's clothing store, showing off a coin trick, growling playfully during a short rendition of Dave Matthews Band's "Crash Into Me." In one video, he bites into a lollipop and is amazed to find gum in the center. "Mmmmm," he says to the camera. "How come nobody ever told me there's bubblegum?" The 10 videos, posted between February and June, featured an artificial-intelligence-generated doppelganger meant to look and sound like Cruise himself.
Adobe, meanwhile, believes that the metaverse is going to blur the distinction between the digital world and the physical world, and it wants to provide the tools to enable that. That's one of the main messages of Adobe Summit 2022, which kicks off today. The fully virtual event is powered by Adobe Experience Cloud, now used by 75% of Fortune 100 companies. "Increasingly, we're using the digital world to do things that we once only did in the physical world. The ongoing conversation on the metaverse reflects the fact that the distinction between what people do in the physical and virtual world is blurring," said Shantanu Narayen, CEO of Adobe, in a statement.