Is Amazon's facial recognition system RACIST?

Daily Mail

Amazon's facial recognition tool, dubbed 'Rekognition', is being called a 'recipe for authoritarianism and disaster' after it was revealed to be in use by law enforcement officials. Now experts say it raises even greater concerns, as the artificial intelligence used to power the technology could exhibit racial bias. Many are calling on Amazon to release data showing it has trained the software to reduce bias, but it has yet to do so. The controversy was spurred by a report from the American Civil Liberties Union (ACLU), which found that Rekognition is being used by law enforcement agencies in Oregon and Florida; Amazon, which markets the tool to police, has defended it.


Orlando Police Testing Amazon's Real-Time Facial Recognition

NPR

Tech companies are trying to sell police real-time facial recognition systems, which can track and identify people as they walk down the street. As NPR reported two weeks ago, American police have generally held off, but there's new evidence that one police department -- Orlando, Fla. -- has decided to try it out. What's more, Orlando ordered its facial recognition system from Amazon. This information was uncovered by the ACLU, which noticed that law enforcement customers were mentioned in the marketing of Amazon's "Rekognition" service. Until now, American police have used facial recognition primarily to compare still photos from crime scenes with mug shots.
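That mug-shot workflow maps onto a single API call. Below is a minimal sketch, assuming the boto3 SDK and hypothetical S3 bucket and file names, of how a still photo can be compared against a booking photo with Rekognition's CompareFaces operation; this illustrates the service's public API, not Orlando's actual pipeline.

```python
# Minimal sketch: comparing a crime-scene still against a mug shot with
# Amazon Rekognition's CompareFaces API via boto3. The bucket and object
# names below are hypothetical placeholders.
import boto3

client = boto3.client("rekognition")

response = client.compare_faces(
    SourceImage={"S3Object": {"Bucket": "example-bucket", "Name": "crime-scene-still.jpg"}},
    TargetImage={"S3Object": {"Bucket": "example-bucket", "Name": "mugshot.jpg"}},
    # Matches scoring below this similarity are discarded; the service
    # default is 80, a setting that figures in the ACLU test below.
    SimilarityThreshold=80.0,
)

for match in response["FaceMatches"]:
    print(f"Match with similarity {match['Similarity']:.1f}%")
```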


Amazon's Facial Recognition System Mistakes Members of Congress for Mugshots

WIRED

Amazon touts its Rekognition facial recognition system as "simple and easy to use," encouraging customers to "detect, analyze, and compare faces for a wide variety of user verification, people counting, and public safety use cases." And yet, in a study released Thursday by the American Civil Liberties Union, the technology managed to confuse photos of 28 members of Congress with publicly available mug shots. Given that Amazon actively markets Rekognition to law enforcement agencies across the US, that's simply not good enough. The ACLU study also illustrated the racial bias that plagues facial recognition today. "Nearly 40 percent of Rekognition's false matches in our test were of people of color, even though they make up only 20 percent of Congress," wrote ACLU attorney Jacob Snow.
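To make that disparity concrete, here is a back-of-the-envelope calculation using only the figures reported above; the per-group counts are derived from the quoted percentages, so they are approximate.

```python
# Back-of-the-envelope sketch of the disparity in the ACLU test, using only
# the figures reported in the article (counts are therefore approximate).
total_members = 535            # House + Senate, all scanned in the test
share_of_color = 0.20          # "only 20 percent of Congress"
false_matches = 28             # members confused with mug shots
false_matches_of_color = round(0.40 * false_matches)  # "nearly 40 percent" ~= 11

members_of_color = total_members * share_of_color     # ~107
other_members = total_members - members_of_color      # ~428

rate_of_color = false_matches_of_color / members_of_color
rate_other = (false_matches - false_matches_of_color) / other_members

print(f"False-match rate, members of color: {rate_of_color:.1%}")  # ~10.3%
print(f"False-match rate, other members:    {rate_other:.1%}")     # ~4.0%
print(f"Disparity: {rate_of_color / rate_other:.1f}x")             # ~2.6x
```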


Amazon receives challenge from face recognition researcher over biased AI

USA TODAY

Facial recognition technology was already seeping into everyday life -- from your photos on Facebook to police scans of mugshots -- when Massachusetts Institute of Technology researcher Joy Buolamwini noticed a serious glitch: some of the software couldn't detect dark-skinned faces like hers, and at one point she had to don a white mask before the software would detect her face at all. That revelation prompted her to launch a project that's having an outsize influence on the debate over how artificial intelligence should be deployed in the real world. Her research has uncovered racial and gender bias in facial analysis tools sold by companies such as Amazon; her tests on software created by brand-name tech firms found much higher error rates in classifying the gender of darker-skinned women than of lighter-skinned men.
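The methodological core of Buolamwini's tests is simple: score a classifier's errors separately for each demographic group instead of reporting one aggregate accuracy. A minimal sketch of that bookkeeping, using made-up audit records rather than her actual benchmark:

```python
# Minimal sketch of a disaggregated accuracy audit: compute error rates
# per demographic group instead of one overall number. The records below
# are illustrative stand-ins for a labeled benchmark, not real data.
from collections import defaultdict

# (group, true_gender, predicted_gender) -- hypothetical audit records
records = [
    ("darker-skinned women", "female", "male"),
    ("darker-skinned women", "female", "female"),
    ("darker-skinned women", "female", "male"),
    ("lighter-skinned men", "male", "male"),
    ("lighter-skinned men", "male", "male"),
]

errors = defaultdict(int)
totals = defaultdict(int)
for group, truth, pred in records:
    totals[group] += 1
    errors[group] += truth != pred  # bool counts as 0 or 1

for group in totals:
    print(f"{group}: {errors[group] / totals[group]:.0%} error rate")
```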


Amazon facial-identification software used by police falls short on tests for accuracy and bias, new research finds

Washington Post - Technology News

Facial-recognition software developed by Amazon and marketed to local and federal law enforcement as a powerful crime-fighting tool struggles to pass basic tests of accuracy, such as correctly identifying a person's gender, new research released Thursday says. Researchers with M.I.T. Media Lab also said Amazon's Rekognition system performed more accurately when assessing lighter-skinned faces, raising concerns about how biased results could tarnish the artificial-intelligence technology's use by police and in public venues, including airports and schools. Amazon's system performed flawlessly in predicting the gender of lighter-skinned men, the researchers said, but misidentified the gender of darker-skinned women in roughly 30 percent of their tests. Rival facial-recognition systems from Microsoft and other companies performed better but were also error-prone, they said. The problem, AI researchers and engineers say, is that the vast sets of images the systems have been trained on skew heavily toward white men.
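Note that the gender test audits Rekognition's face analysis rather than its face matching: the DetectFaces operation returns a predicted gender with a confidence score for each detected face. A minimal sketch, again with hypothetical bucket and file names:

```python
# Minimal sketch of the kind of call audited here: Rekognition's DetectFaces
# returns a gender prediction per face. Bucket/object names are hypothetical.
import boto3

client = boto3.client("rekognition")

response = client.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "portrait.jpg"}},
    Attributes=["ALL"],  # request gender, age range, etc., not just bounding boxes
)

for face in response["FaceDetails"]:
    gender = face["Gender"]
    print(f"Predicted gender: {gender['Value']} ({gender['Confidence']:.1f}% confidence)")
```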