Alethea AI, a synthetic media company, is piloting "privacy-preserving face skins," or digital masks that counter facial recognition algorithms and help users preserve privacy in pre-recorded videos. The move comes as companies such as IBM, Microsoft, and Amazon have announced they would suspend sales of their facial recognition technology to law enforcement agencies.

"This is a new technique we developed in-house that wraps a face with our AI algorithms," said Alethea AI CEO Arif Khan. "Avatars are fun to play with and develop, but these 'masks/skins' are a different, more potent, animal to preserve privacy."

The Los Angeles-based startup launched in 2019 with a focus on creating avatars that content creators could license out for revenue. The idea comes as deepfakes, or manipulated media that can make someone appear to be doing or saying almost anything, become more accessible and widespread. According to a 2019 report from Deep Trace, a company that detects and monitors deepfakes, there were more than 14,000 deepfakes online in 2019, and more than 850 people were targeted by them.

Alethea AI wants to let creators use their own synthetic media avatars for marketing purposes, in a sense letting people leverage deepfakes of themselves for money. Khan compares today's proliferation of facial recognition data to the Napster-style explosion in music piracy in the early 2000s.
Companies like Clearview AI have already harvested large amounts of facial data for facial recognition algorithms and resold that data to security services without consent, with all the bias inherent in such algorithms, which are generally less accurate on women and people of color.

Clearview AI has marketed itself to law enforcement and scraped billions of images from websites such as Facebook, YouTube, and Venmo. The company is currently being sued for doing so.

"We will get to a point where there needs to be an iTunes sort of layer, where your face and voice data somehow gets protected," said Khan.

One part of that is creators licensing out their likeness for a fee. Crypto entrepreneur Alex Masmej was the first such avatar: for $99 you can hire the avatar to say 200 words of whatever you want, provided the real Masmej approves the text.

Alethea AI has also partnered with software firm Oasis Labs, so that all content generated for Alethea AI's synthetic media marketplace will be verified using Oasis Labs' secure blockchain, akin to Twitter's "verified" blue check mark.

"There are a lot of Black Mirror scenarios when we think of deepfakes, but if my personal approval is needed for my deepfakes and it's then time-stamped on a public blockchain for anyone to verify the videos that I actually want to release, that provides a protection that deepfakes are currently lacking," said Masmej.

The privacy pilot takes this idea one step further: not only creating a deepfake to license out, but preventing companies, or anyone else, from grabbing your facial data from a recording.

There are two parts to the privacy component. The first, currently being piloted, involves pre-recorded videos.
Users upload a video and identify where, and which, face skin they would like superimposed on their own. Alethea AI's algorithms then map the key points of the user's face and wrap the mask around that key-point map. The video is then sent back to the client.

Alethea AI also wants to enable face masking during real-time communications, such as on a Zoom call. But Khan says computing power doesn't quite allow that yet, though he hopes it will be possible within a year.

Alethea AI piloted one example of the tech with Crypto AI Profit, a blockchain and AI influencer, who used it during a YouTube video.

Deepfakes, voice spoofing, and other tech-enabled mimicry seem here to stay, but Khan is still optimistic that we're not yet at the point of no return when it comes to protecting ourselves.

"I'm hopeful that the individual is accorded some sort of framework in this entire emerging landscape," said Khan. "It's going to be a very interesting ride. I don't think the battle is fully decided, although existing systems are oriented toward preserving larger, more corporate input."
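The wrapping step described above, mapping key points on a face and fitting a mask to that key-point map, is a standard landmark-alignment idea. Below is a minimal NumPy sketch of that idea only, not Alethea AI's actual code: it assumes facial landmarks have already been detected in a frame, and fits an affine transform that carries anchor points defined on the mask artwork onto the detected landmarks. The coordinates are made up for illustration.

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine transform mapping src points onto dst points.

    src, dst: (N, 2) arrays of matching landmark coordinates.
    Returns a (2, 3) matrix A such that [x, y, 1] @ A.T approximates dst.
    """
    n = src.shape[0]
    homog = np.hstack([src, np.ones((n, 1))])        # (N, 3) homogeneous coords
    sol, *_ = np.linalg.lstsq(homog, dst, rcond=None)  # (3, 2) solution
    return sol.T                                      # (2, 3) affine matrix

def warp_points(points, A):
    """Apply the fitted affine transform to an array of (x, y) points."""
    homog = np.hstack([points, np.ones((points.shape[0], 1))])
    return homog @ A.T

# Toy example: three landmarks (two eyes and the chin) detected on a frame,
# and the matching anchor points defined on the mask artwork.
face_landmarks = np.array([[120.0, 80.0], [180.0, 82.0], [150.0, 160.0]])
mask_anchors = np.array([[30.0, 20.0], [90.0, 20.0], [60.0, 100.0]])

A = fit_affine(mask_anchors, face_landmarks)
mapped = warp_points(mask_anchors, A)

# With three non-collinear point pairs the affine fit is exact, so the
# warped mask anchors land on the detected landmarks.
assert np.allclose(mapped, face_landmarks, atol=1e-6)
```

A production system would detect dozens of landmarks per frame, use a denser warp than a single affine transform, and blend the mask's pixels rather than just its anchor points, but the fit-then-warp structure is the same.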