Amazon's facial recognition technology falsely identified 28 members of Congress as people who have been arrested for crimes, according to the American Civil Liberties Union (ACLU). The ACLU of Northern California's test of Amazon's controversial Rekognition software also found that people of color were disproportionately misidentified in a mugshot database, raising new concerns about racial bias and the potential for abuse by law enforcement. The report followed revelations in May that Amazon has been marketing and selling the Rekognition technology to police agencies, leading privacy advocates to urge CEO Jeff Bezos to stop providing the product to the government. "Our test reinforces that face surveillance is not safe for government use," Jacob Snow, a technology and civil liberties attorney at the ACLU Foundation of Northern California, said in a statement. "Face surveillance will be used to power discriminatory surveillance and policing that targets communities of color, immigrants, and activists."
SAN FRANCISCO -- Amazon's controversial facial recognition program, Rekognition, falsely identified 28 members of Congress during a test of the program by the American Civil Liberties Union, the civil rights group said Thursday. In its test, the ACLU scanned photos of all members of Congress and had the system compare them with a public database of 25,000 mugshots. The group used the default "confidence threshold" setting of 80 percent for Rekognition, meaning the test counted a face match at 80 percent certainty or more. At that setting, the system misidentified 28 members of Congress, a disproportionate number of whom were people of color, tagging them instead as entirely different people who have been arrested for a crime. The faces of members of Congress used in the test include Republicans and Democrats, men and women, and legislators of all ages.
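The thresholding rule described above can be illustrated with a minimal sketch. The similarity scores below are invented for illustration only; the ACLU's actual test ran Amazon Rekognition against a real database of 25,000 mugshot images, and a production setup would obtain these scores from the Rekognition service itself.

```python
# Sketch of the "confidence threshold" logic: a candidate pair counts as a
# match only when its similarity score meets or exceeds the threshold.
# Rekognition's default setting at the time of the test was 80 percent.
DEFAULT_CONFIDENCE_THRESHOLD = 80.0

def count_matches(candidate_similarities, threshold=DEFAULT_CONFIDENCE_THRESHOLD):
    """Count candidate face pairs whose similarity meets the threshold."""
    return sum(1 for score in candidate_similarities if score >= threshold)

# Hypothetical similarity scores for a handful of (lawmaker, mugshot) pairs:
scores = [95.2, 81.7, 79.9, 80.0, 64.3]
print(count_matches(scores))  # 3 pairs score at or above 80 percent
```

Raising the threshold (Amazon later recommended 99 percent for law enforcement uses) would shrink the match count, which is why the choice of default setting was central to the dispute over the test.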
Amazon's face surveillance technology is the target of growing opposition nationwide, and today, there are 28 more causes for concern. In a test the ACLU recently conducted of the facial recognition tool, called "Rekognition," the software incorrectly matched 28 members of Congress, identifying them as other people who have been arrested for a crime. The members of Congress who were falsely matched with the mugshot database we used in the test include Republicans and Democrats, men and women, and legislators of all ages, from all across the country. Our test used Amazon Rekognition to compare images of members of Congress with a database of mugshots. The results included 28 incorrect matches.
Amazon's Rekognition facial surveillance technology has wrongly tagged 28 members of Congress as police suspects, according to ACLU research, which notes that nearly 40 percent of the lawmakers identified by the system are people of color. In a blog post, Jacob Snow, technology and civil liberties attorney for the ACLU of Northern California, said that the false matches were made against a mugshot database. The matches were also disproportionately people of color, he said. These include six members of the Congressional Black Caucus, among them civil rights legend Rep. John Lewis, D-Ga.
Amazon touts its Rekognition facial recognition system as "simple and easy to use," encouraging customers to "detect, analyze, and compare faces for a wide variety of user verification, people counting, and public safety use cases." And yet, in a study released Thursday by the American Civil Liberties Union, the technology managed to confuse photos of 28 members of Congress with publicly available mug shots. Given that Amazon actively markets Rekognition to law enforcement agencies across the US, that's simply not good enough. The ACLU study also illustrated the racial bias that plagues facial recognition today. "Nearly 40 percent of Rekognition's false matches in our test were of people of color, even though they make up only 20 percent of Congress," wrote ACLU attorney Jacob Snow.
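The disparity Snow describes can be checked with quick arithmetic. Note that the raw count of 11 below is an inference from the reported figures ("nearly 40 percent" of 28 false matches), not a number stated in the ACLU's post.

```python
# Back-of-the-envelope check of the reported racial disparity.
# The count 11 is inferred from "nearly 40 percent of 28"; the ACLU
# reported the percentage, not the raw count.
false_matches = 28
misidentified_people_of_color = 11  # inferred, not stated in the source
share = misidentified_people_of_color / false_matches
print(round(share * 100, 1))  # 39.3, i.e. "nearly 40 percent"

# People of color made up roughly 20 percent of Congress at the time,
# so false matches hit them at about double their share of the body.
```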