Cloak your photos with this AI privacy tool to fool facial recognition

#artificialintelligence

Ubiquitous facial recognition is a serious threat to privacy. The idea that the photos we share are being collected by companies to train algorithms that are sold commercially is worrying. Anyone can buy these tools, snap a photo of a stranger, and find out who they are in seconds. But researchers have come up with a clever way to help combat this problem. The solution is a tool named Fawkes, created by scientists at the University of Chicago's SAND Lab.


Fawkes protects your identity from facial recognition systems, pixel by pixel

ZDNet

A new tool has been proposed for cloaking our true identities when photos are posted online, to prevent profiling through facial recognition systems. Deep learning tools and facial recognition software have now permeated our daily lives. From surveillance cameras equipped with facial trackers to photo-tagging suggestions on social media, the use of these technologies is now common -- and often controversial. A number of US states and the EU are considering banning facial recognition cameras in public spaces. IBM has already exited the business, on the grounds that the technology could end up reinforcing racial bias.


Can This AI Filter Protect Identities From Facial Recognition System?

#artificialintelligence

Facial recognition technology has long been a matter of grave concern -- so much so that earlier this year major tech giants like Microsoft, Amazon, and IBM, as well as Google, stopped selling their facial recognition technology (FRT) to police authorities. Additionally, Clearview AI's facial recognition app, which scraped billions of images of people without consent, made matters even worse for the public. In fact, the whole practice of companies using people's social media images without permission to train their FRT algorithms threatens the general public's identity and personal privacy. To protect against such misuse, researchers from the computer science department of the University of Chicago have proposed an AI system to fool these facial recognition systems. Named Fawkes -- a nod to Guy Fawkes -- this AI system is designed to help users safeguard their images and selfies with a filter against unwanted facial recognition models. This filter, which the researchers call a "cloak," adds pixel-level changes that are invisible to the human eye but can deceive these FRTs.
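
The actual Fawkes cloak is computed by optimizing a perturbation against a facial-feature extractor so that a photo's features shift toward a different identity; the short Python sketch below only illustrates what a bounded, visually imperceptible pixel-level change looks like. The EPSILON budget, the random noise, and the file names are illustrative assumptions -- random perturbations like these would not actually fool a recognition model.

```python
# Toy illustration of a bounded, pixel-level "cloak" -- NOT the Fawkes algorithm.
# Fawkes optimizes its perturbation against a feature extractor; this sketch only
# shows how small per-pixel changes can be applied while staying visually imperceptible.
import numpy as np
from PIL import Image

EPSILON = 8  # assumed per-pixel budget (out of 255); small enough to be hard to notice

def apply_toy_cloak(img: Image.Image, seed: int = 0) -> Image.Image:
    """Shift every pixel by at most EPSILON intensity levels."""
    rng = np.random.default_rng(seed)
    pixels = np.asarray(img, dtype=np.int16)           # widen dtype to avoid uint8 overflow
    noise = rng.integers(-EPSILON, EPSILON + 1, size=pixels.shape)
    cloaked = np.clip(pixels + noise, 0, 255).astype(np.uint8)
    return Image.fromarray(cloaked)

if __name__ == "__main__":
    # "selfie.jpg" is a placeholder path for any photo you want to experiment with.
    original = Image.open("selfie.jpg").convert("RGB")
    apply_toy_cloak(original).save("selfie_cloaked.jpg")
```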


Protect Your Profile Photo with a Privacy Cloak

#artificialintelligence

Do we give any thought to the privacy of the profile photos we make publicly available across social media? Have we ever worried about privacy when sharing innumerable photos of friends and family members on Facebook or Instagram? And why should we care about protecting the privacy of our photos in the first place? We should because our publicly available photos can be used for unauthorized facial recognition, and that can invade our private lives. There is little doubt that facial recognition is a serious threat to privacy.


This Filter Makes Your Photos Invisible to Facial Recognition

#artificialintelligence

In 2020, it's safe to assume that any photo uploaded and made public on the internet will be analyzed by facial recognition. Not only do companies like Google and Facebook apply facial recognition as a feature, but companies like Clearview AI have for years been discreetly scraping images from the public internet in order to sell facial recognition technology to police. Now, A.I. researchers are starting to think about how technology can solve the problem it created. These algorithms aren't the solution to privacy on the web -- and they don't claim to be. But they're tools that, if adopted by online platforms, could claw back a little of the privacy typically lost by posting images online.