AI claims to be able to thwart facial recognition software, making you "invisible"

#artificialintelligence

A team of engineering researchers from the University of Toronto has created an algorithm that dynamically disrupts facial recognition systems. Led by professor Parham Aarabi and graduate student Avishek Bose, the team used a deep learning technique called "adversarial training", which pits two artificial intelligence algorithms against each other. Aarabi and Bose designed a pair of neural networks: the first identifies faces, and the second works to disrupt the facial recognition task of the first. The two constantly battle and learn from each other, setting up an ongoing AI arms race. "The disruptive AI can 'attack' what the neural net for the face detection is looking for," Bose said in an interview.
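The article doesn't include the researchers' code, but the setup it describes maps onto a familiar two-network training loop. The sketch below is an illustrative, simplified version of that idea, not the Toronto team's actual method: a toy "detector" network classifies whether an image contains a face, while an "attacker" network learns a small, bounded perturbation that pushes the detector toward a "no face" answer. The architectures, 64x64 input size, and hyperparameters are placeholder assumptions.

```python
# Minimal sketch of adversarial training between a face detector and a
# disruptor network; shapes and hyperparameters are illustrative only.
import torch
import torch.nn as nn

class Detector(nn.Module):
    """Toy face/no-face classifier standing in for a face detector."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 1))
    def forward(self, x):
        return self.net(x)  # logit: > 0 means "face present"

class Attacker(nn.Module):
    """Produces a small additive perturbation intended to hide the face."""
    def __init__(self, eps=0.05):
        super().__init__()
        self.eps = eps
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 3, 3, padding=1), nn.Tanh())
    def forward(self, x):
        return x + self.eps * self.net(x)  # bounded change to the image

detector, attacker = Detector(), Attacker()
opt_d = torch.optim.Adam(detector.parameters(), lr=1e-4)
opt_a = torch.optim.Adam(attacker.parameters(), lr=1e-4)
bce = nn.BCEWithLogitsLoss()

faces = torch.rand(8, 3, 64, 64)   # placeholder face images
labels = torch.ones(8, 1)          # 1 = "face present"

for step in range(100):
    # Attacker tries to make the detector say "no face" on perturbed images.
    perturbed = attacker(faces)
    loss_a = bce(detector(perturbed), torch.zeros_like(labels))
    opt_a.zero_grad(); loss_a.backward(); opt_a.step()

    # Detector is trained to keep recognizing both clean and perturbed faces.
    loss_d = (bce(detector(faces), labels)
              + bce(detector(attacker(faces).detach()), labels))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()
```

As both networks improve, each pushes the other: the detector hardens against the current perturbations, and the attacker adapts to the hardened detector, which is the "arms race" dynamic the researchers describe.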


Facebook wants your face data -- in the name of privacy, it says

Washington Post - Technology News

Facebook will let you know when someone posts a photo of you -- even if you aren't tagged in it -- becoming the latest tech giant to add more facial recognition technology into users' everyday lives.


Facial recognition tech sucks, but it's inevitable

#artificialintelligence

These are just some of the questions being raised by lawmakers, civil libertarians, and privacy advocates in the wake of an ACLU report released last summer that claimed Amazon's facial recognition software, Rekognition, misidentified 28 members of Congress as criminals. Rekognition is a general-purpose application programming interface (API) that developers can use to build applications that detect and analyze scenes, objects, faces, and other items within images. The source of the controversy was a pilot program in which Amazon teamed up with law enforcement in two jurisdictions, Orlando, Florida, and Washington County, Oregon, to explore the use of facial recognition in policing. In January 2019, the Daily Mail reported that the FBI had been testing Rekognition since early 2018. The Project on Government Oversight also revealed via a Freedom of Information Act request that Amazon had pitched Rekognition to ICE in June 2018.
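To make "general-purpose API" concrete, here is a minimal sketch of one such call through the boto3 SDK's detect_faces operation. The image path and region are placeholders, AWS credentials are assumed to be configured in the environment, and this is not the setup used in the law-enforcement pilots described above.

```python
# Minimal sketch of calling Amazon Rekognition's face-detection API with boto3.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("group_photo.jpg", "rb") as f:   # placeholder image path
    image_bytes = f.read()

# detect_faces returns bounding boxes plus attributes such as estimated
# age range, emotions, and whether the person appears to wear glasses.
response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    box = face["BoundingBox"]
    print(f"face at left={box['Left']:.2f}, top={box['Top']:.2f}, "
          f"confidence={face['Confidence']:.1f}%")
```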


These patterned glasses are all it takes to fool AI-powered facial recognition - ZDNet

AITopics Original Links

Researchers at Carnegie Mellon have developed patterned eyeglass frames that can trick facial-recognition algorithms into seeing someone else's face. The printed frames allowed the three researchers to dodge a machine-learning-based facial-recognition system 80 percent of the time. Using certain variants of the frames, a white male was also able to fool the algorithm into mistaking him for the actress Milla Jovovich, while a South Asian female tricked it into seeing a Middle Eastern male.
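The core trick behind such frames is an adversarial perturbation confined to the region the glasses cover. The sketch below is an illustrative simplification, not the CMU authors' code: it optimizes a perturbation inside an eyeglass-frame-shaped mask so that a face classifier scores the image as a chosen target identity. The face_model, target class index, and tensor shapes are assumptions.

```python
# Illustrative masked adversarial attack: the perturbation is restricted to
# an eyeglass-frame mask and optimized toward a target identity.
import torch
import torch.nn.functional as F

def eyeglass_attack(face_model, image, frame_mask, target_class,
                    steps=200, lr=0.01):
    """image: (1,3,H,W) in [0,1]; frame_mask: (1,1,H,W), 1 where frames sit."""
    delta = torch.zeros_like(image, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        # Apply the perturbation only inside the printed-frame region.
        adv = torch.clamp(image + delta * frame_mask, 0.0, 1.0)
        logits = face_model(adv)
        loss = F.cross_entropy(logits, torch.tensor([target_class]))
        opt.zero_grad()
        loss.backward()
        opt.step()
    return torch.clamp(image + delta.detach() * frame_mask, 0.0, 1.0)
```

Because the optimized pattern lives entirely on the frames, it can be printed and worn, which is what makes this a physical-world attack rather than a purely digital one.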


Facial recognition tech makes it official: There is no privacy anymore

#artificialintelligence

Recent weeks have brought controversy over electronic billboards in restaurants and shopping precincts that use advanced facial recognition techniques not only to serve personalized advertisements but also to measure and record the consumer and their response, ostensibly to enable retailers to provide more targeted marketing and services. In Oslo, the restaurant Peppe's Pizza had its use of such billboards exposed when a crashed digital advertisement revealed the code behind its facial recognition system. The billboard includes a camera and facial recognition software that can register gender, whether the watcher is young or an adult, facial expression, and whether they wear glasses. In response, designer Youssef Sarhan did a little digging in his home city of Dublin and discovered similar billboards in operation there. "Your attention (and the meta-data associated with it) is being relayed to advertisers without your permission or awareness, and there is no way to opt out," Sarhan wrote.
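For a sense of what such billboard software does with each camera frame, here is a rough sketch, not the vendor's code, of a multi-head classifier scoring the attributes the exposed code reportedly logged (gender, adult/child, expression, glasses). The backbone, head sizes, and category labels are arbitrary illustrative choices.

```python
# Illustrative multi-attribute classifier for a single camera frame.
import torch
import torch.nn as nn

class AttributeHeads(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.gender = nn.Linear(64, 2)      # female / male
        self.age_group = nn.Linear(64, 2)   # child / adult
        self.expression = nn.Linear(64, 4)  # e.g. neutral/happy/sad/angry
        self.glasses = nn.Linear(64, 2)     # no glasses / glasses

    def forward(self, frame):
        feats = self.backbone(frame)
        return {
            "gender": self.gender(feats).softmax(-1),
            "age_group": self.age_group(feats).softmax(-1),
            "expression": self.expression(feats).softmax(-1),
            "glasses": self.glasses(feats).softmax(-1),
        }

model = AttributeHeads().eval()
frame = torch.rand(1, 3, 128, 128)   # placeholder camera frame
with torch.no_grad():
    predictions = {k: v.argmax(-1).item() for k, v in model(frame).items()}
print(predictions)
```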