Erik Learned-Miller is one reason we talk about facial recognition at all. In 2007, years before the current A.I. boom made "deep learning" and "neural networks" common phrases in Silicon Valley, Learned-Miller and three colleagues at the University of Massachusetts Amherst released a dataset of faces called Labeled Faces in the Wild. To you or me, Labeled Faces in the Wild just looks like folders of unremarkable images. You can download them and look for yourself. There's boxer Joe Gatti, gloves raised mid-fight.
Last year, communities banded together to prove that they can--and will--defend their privacy rights. As part of ACLU-led campaigns, three California cities--San Francisco, Berkeley, and Oakland--and three Massachusetts municipalities--Somerville, Northampton, and Brookline--banned government use of face recognition in their communities. Following another ACLU effort, the state of California blocked police body-camera use of the technology, forcing San Diego's police department to shutter its massive face surveillance flop. And in New York City, tenants successfully fended off their landlord's efforts to install face surveillance. Even the private sector recognized it had a responsibility to act in the face of the growing threat of face surveillance.
Amazon's online facial recognition system incorrectly matched pictures of US Congress members to mugshots of suspected criminals in a study by the American Civil Liberties Union. The ACLU, a nonprofit headquartered in New York, has called for Congress to ban cops and Feds from using any sort of computer-powered facial recognition technology because, well, it sucks. Amazon's AI-powered Rekognition service was previously criticized by the ACLU when the organization revealed the web giant was aggressively marketing its face-matching tech to police in Washington County, Oregon, and Orlando, Florida. Rekognition is touted by the Bezos Bunch as, among other applications, a way to identify people in real time from surveillance-camera footage or from officers' body cameras. In the ACLU's latest probing, Rekognition mistook images of 28 members of Congress for mugshots of people arrested on suspicion of crimes.
These are just some of the questions being raised by lawmakers, civil libertarians, and privacy advocates in the wake of an ACLU report released last summer claiming that Amazon's facial recognition software, Rekognition, misidentified 28 members of Congress as criminals. Rekognition is a general-purpose application programming interface (API) that developers can use to build applications that detect and analyze scenes, objects, faces, and other items within images. The source of the controversy was a pilot program in which Amazon teamed up with law enforcement in two jurisdictions, Orlando, Florida, and Washington County, Oregon, to explore the use of facial recognition in policing. In January 2019, the Daily Mail reported that the FBI had been testing Rekognition since early 2018. The Project on Government Oversight also revealed, via a Freedom of Information Act request, that Amazon pitched Rekognition to ICE in June 2018.
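To make the API concrete: a developer comparing two faces with Rekognition calls its CompareFaces action, typically through the AWS SDK. The sketch below only builds the request payload with the standard library (no AWS account or boto3 needed); the helper name `build_compare_faces_request` is ours, not Amazon's, though the field names (`SourceImage`, `TargetImage`, `SimilarityThreshold`) match the CompareFaces API. Notably, the ACLU's test used the API's default 80 percent similarity threshold rather than the 99 percent Amazon recommends for law enforcement, which is part of why the match count was so high.

```python
import base64
import json

def build_compare_faces_request(source_bytes: bytes, target_bytes: bytes,
                                similarity_threshold: float = 80.0) -> dict:
    """Build a CompareFaces-style request payload.

    In practice you would call the service via boto3:
        boto3.client("rekognition").compare_faces(
            SourceImage={"Bytes": source_bytes},
            TargetImage={"Bytes": target_bytes},
            SimilarityThreshold=similarity_threshold,
        )
    On the wire, raw image bytes are base64-encoded, as shown here.
    """
    return {
        "SourceImage": {"Bytes": base64.b64encode(source_bytes).decode("ascii")},
        "TargetImage": {"Bytes": base64.b64encode(target_bytes).decode("ascii")},
        # 80.0 is the API default; Amazon recommends 99.0 for policing use.
        "SimilarityThreshold": similarity_threshold,
    }

# Placeholder bytes stand in for real JPEG data.
payload = build_compare_faces_request(b"<congress portrait>", b"<mugshot>")
print(json.dumps(payload, indent=2))
```

The service responds with a list of face matches, each carrying a similarity score; lowering the threshold widens the net, which is exactly the knob at issue in the ACLU-versus-Amazon dispute over the test's methodology.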
[Image caption: A conductive model of a finger, used to spoof a fingerprint ID system, created by Anil Jain, professor of computer science at Michigan State University and an expert on biometric technology.]

SAN FRANCISCO -- Your shiny new smartphone may unlock with only your thumbprint, eye or face. Meanwhile, the FBI is struggling to gain access to the iPhone of Texas church gunman Devin Kelley, who killed 25 people in a shooting rampage.