Microsoft and Amazon are at the center of an ACLU lawsuit on facial recognition

#artificialintelligence

The American Civil Liberties Union (ACLU) is pressing forward with a lawsuit involving the facial recognition software offered by Amazon and Microsoft to government clients. In a complaint filed in a Massachusetts federal court, the ACLU asked for a variety of government records, including inquiries to companies, records of meetings about the piloting or testing of facial recognition, voice recognition, and gait recognition technology, requests for proposals, and licensing agreements. At the heart of the lawsuit are Amazon's Rekognition and Microsoft's Face API, both facial recognition products available to customers of the companies' cloud platforms. The ACLU has also asked for more details on the US government's use of voice recognition and gait recognition, the automated process of identifying a person by comparing images of the way they walk. Police in Shanghai and Beijing are already using gait-analysis tools to identify people.


Amazon investors press company to stop selling 'racially biased' surveillance tech to government agencies

FOX News

A group of Amazon shareholders is pushing the tech giant to stop selling its controversial facial recognition technology to U.S. government agencies, just days after a coalition of 85 human rights, faith, and racial justice groups demanded in an open letter that Jeff Bezos' company stop marketing surveillance technology to the feds. Over the last year, the "Rekognition" technology, which has reportedly been marketed to U.S. Immigration and Customs Enforcement (ICE), has come under fire from immigrants' rights groups and privacy advocates who argue that it can be misused and ultimately lead to racially biased outcomes. A test of the technology by the American Civil Liberties Union (ACLU) showed that 28 members of Congress, mostly people of color, were incorrectly identified as police suspects. According to media reports and the ACLU, Amazon has already sold or marketed "Rekognition" to law enforcement agencies in three states.


Rights group files federal complaint against AI-hiring firm HireVue, citing 'unfair and deceptive' practices

#artificialintelligence

A prominent rights group is urging the Federal Trade Commission to take on the recruiting-technology company HireVue, arguing the firm has turned to unfair and deceptive trade practices in its use of face-scanning technology to assess job candidates' "employability." The Electronic Privacy Information Center, known as EPIC, on Wednesday filed an official complaint calling on the FTC to investigate HireVue's business practices, saying the company's use of unproven artificial-intelligence systems that scan people's faces and voices constituted a wide-scale threat to American workers. HireVue's "AI-driven assessments," which more than 100 employers have used on more than a million job candidates, use video interviews to analyze hundreds of thousands of data points related to a person's speaking voice, word selection and facial movements. The system then creates a computer-generated estimate of the candidates' skills and behaviors, including their "willingness to learn" and "personal stability." Candidates aren't told their scores, but employers can use those reports to decide whom to hire or disregard.


Thoughts On Machine Learning Accuracy (Amazon Web Services)

#artificialintelligence

Let's start with some comments about a recent ACLU blog in which they ran a facial recognition trial. Using Rekognition, the ACLU built a face database using 25,000 publicly available arrest photos and then performed facial similarity searches of that database using public photos of all current members of Congress. They found 28 incorrect matches out of 535, using an 80% confidence level; this is a 5% misidentification rate (sometimes called a 'false positive' rate), or conversely a 95% accuracy rate. The ACLU has not published its data set, methodology, or results in detail, so we can only go on what they've publicly said. To illustrate the impact of confidence threshold on false positives, we ran a test where we created a face collection using a dataset of over 850,000 faces commonly used in academia.
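To make the arithmetic concrete, here is a minimal sketch of an ACLU-style false positive check using the AWS SDK for Python (boto3) against an existing Rekognition face collection. The collection name, probe image paths, and probe count are illustrative assumptions, not details published by the ACLU or AWS; only the 80% threshold comes from the excerpt above.

```python
# Sketch of an ACLU-style false positive check against a Rekognition face
# collection. Collection name and probe paths below are hypothetical.
import boto3

rekognition = boto3.client("rekognition")

COLLECTION_ID = "arrest-photos-collection"  # assumed to already hold the mugshot faces
THRESHOLD = 80.0                            # similarity threshold used in the ACLU test

def matches_above_threshold(image_path, threshold=THRESHOLD):
    """Search the collection with one probe photo; return matches at or above threshold."""
    with open(image_path, "rb") as f:
        response = rekognition.search_faces_by_image(
            CollectionId=COLLECTION_ID,
            Image={"Bytes": f.read()},
            FaceMatchThreshold=threshold,  # Rekognition drops matches below this similarity
            MaxFaces=5,
        )
    return response["FaceMatches"]

# Probe photos of people known NOT to be in the collection (members of
# Congress in the ACLU test), so any returned match is a false positive.
probes = ["congress/member_0001.jpg", "congress/member_0002.jpg"]  # placeholder paths

hits = sum(1 for path in probes if matches_above_threshold(path))
print(f"{hits} of {len(probes)} probes falsely matched "
      f"({hits / len(probes):.1%}) at a {THRESHOLD:.0f}% threshold")
```

Re-running the same probes with FaceMatchThreshold set to 99 instead of 80 would show directly how much the 28-of-535 false match count depends on where the threshold is set, which is the point the AWS post goes on to make with its own 850,000-face test.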


Lawmakers need to curb face recognition searches by police

Los Angeles Times

When is it appropriate for police to conduct a face recognition search? To figure out who's who in a crowd of protesters? To monitor foot traffic in a high-crime neighborhood? To confirm the identity of a suspect -- or a witness -- caught on tape? According to a new report by Georgetown Law's Center on Privacy & Technology, these are questions very few police departments asked before widely deploying face recognition systems.