Facebook will face a class-action lawsuit in the wake of its privacy scandal, a US federal judge has ruled. Allegations of privacy violations emerged when it was revealed the company used a photo-scanning tool on users' images without their explicit consent. The facial recognition tool, launched in 2010, suggests names for people it identifies in photos uploaded by users. Under Illinois state law, the company could be fined $1,000 to $5,000 (£700-£3,500) each time a person's image was used without consent. The technology was suspended for users in Europe in 2012 over privacy fears but is still live in the US and other regions worldwide.
This week's furor over FaceApp has largely centered on concerns that its Russian developers might be compelled to share the app's data with the Russian government, much as the Snowden disclosures illustrated the myriad ways in which American companies were compelled to disclose their private user data to the US government. Yet this reflects a mistaken understanding of how the modern data trade actually works, and of the simple fact that American universities and companies routinely make their data available to companies all across the world, including in Russia and China. In today's globalized world, data is just as globalized, with national borders no longer restricting the flow of our personal information - a trend made worse by the data-hungry world of deep learning. Data brokers have long bought and sold our personal data in a shadowy world of international trade involving our most intimate and private information. The digital era has upended this explicit trade, replacing it with an interlocking world of passive exchange via analytics services.
(Image: a conductive model of a finger, used to spoof a fingerprint ID system, created by Prof. Anil Jain, a professor of computer science at Michigan State University and expert on biometric technology.) SAN FRANCISCO -- Your shiny new smartphone may unlock with only your thumbprint, eye or face. The FBI is struggling to gain access to the iPhone of Texas church gunman Devin Kelley, who killed 25 people in a shooting rampage.
Amazon's online facial recognition system incorrectly matched pictures of US Congress members to mugshots of suspected criminals in a study by the American Civil Liberties Union. The ACLU, a nonprofit headquartered in New York, has called for Congress to ban cops and Feds from using any sort of computer-powered facial recognition technology due to the fact that, well, it sucks. Amazon's AI-powered Rekognition service was previously criticized by the ACLU when it revealed the web giant was aggressively marketing its face-matching tech to police in Washington County, Oregon, and Orlando, Florida. Rekognition is touted by the Bezos Bunch as, among other applications, a way to identify people in real time from surveillance camera footage or from officers' body cameras. The results from the ACLU's latest probing showed that Rekognition mistook images of 28 members of Congress for mugshots of cuffed people suspected of crimes.
US army researchers have developed a convolutional neural network and a range of algorithms to recognise faces in the dark. "This technology enables matching between thermal face images and existing biometric face databases or watch lists that only contain visible face imagery," explained Benjamin Riggan on Monday, co-author of the study and an electronics engineer at the US army laboratory. "The technology provides a way for humans to visually compare visible and thermal facial imagery through thermal-to-visible face synthesis." The thermal images are processed and passed to a convolutional neural network, which extracts facial features using landmarks marking the corners of the eyes, nose and lips to determine the face's overall shape. The system, dubbed "multi-region synthesis", is trained with a loss function that minimises the error between the synthesised visible images and the real ones, producing an accurate portrayal of what someone's face looks like despite it being glimpsed only in the dark.
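The core training idea described above - penalising reconstruction error over the whole face plus extra terms for landmark-centred regions around the eyes, nose and mouth - can be sketched as a loss function. This is a minimal illustration, not the Army lab's actual implementation: the region coordinates, region weight, and the use of a plain L1 error are all assumptions made for the example.

```python
import numpy as np

# Hypothetical landmark-centred regions as (row, col, size) on a 64x64 face
# crop; a real system would place these using a landmark detector on the
# corners of the eyes, nose and lips.
REGIONS = {
    "left_eye": (20, 18, 12),
    "right_eye": (20, 46, 12),
    "nose": (34, 32, 14),
    "mouth": (48, 32, 16),
}

def crop(img, center_r, center_c, size):
    """Square crop of side `size` centred at (center_r, center_c)."""
    h = size // 2
    return img[center_r - h:center_r + h, center_c - h:center_c + h]

def multi_region_loss(synth_visible, true_visible, regions=REGIONS,
                      region_weight=1.0):
    """L1 error between a synthesised visible image and the ground truth,
    computed globally and again over each landmark-centred region, so that
    training emphasises the facial features used for matching."""
    # Global reconstruction error over the whole face.
    loss = np.abs(synth_visible - true_visible).mean()
    # Extra penalty per landmark region (eyes, nose, mouth).
    for (r, c, s) in regions.values():
        loss += region_weight * np.abs(
            crop(synth_visible, r, c, s) - crop(true_visible, r, c, s)
        ).mean()
    return loss
```

During training, the synthesis network's output would be passed to `multi_region_loss` against the paired visible image, and the total driven toward zero; a perfect reconstruction scores exactly zero, while any mismatch in a landmark region is penalised more heavily than the same mismatch elsewhere.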