Amazon's face-detection technology for police shows bias, researchers say

The Japan Times

NEW YORK - Facial-detection technology that Amazon is marketing to law enforcement often misidentifies women, particularly those with darker skin, according to researchers from MIT and the University of Toronto. Privacy and civil rights advocates have called on Amazon to stop marketing its Rekognition service because of worries about discrimination against minorities, and some Amazon investors have asked the company to stop selling it, fearing that the service leaves Amazon vulnerable to lawsuits. The researchers said that in their tests, Amazon's technology labeled darker-skinned women as men 31 percent of the time; lighter-skinned women were misidentified 7 percent of the time.
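For readers unfamiliar with the service, "labeling" here refers to the gender attribute that Rekognition's face-detection API returns alongside each detected face. The snippet below is a minimal sketch of querying that attribute with the AWS SDK for Python (boto3); the image file name is illustrative, and configured AWS credentials are assumed.

```python
import boto3

# Sketch: ask Rekognition for face attributes, including its gender
# prediction (the feature the researchers audited). "face.jpg" is a
# placeholder file name.
client = boto3.client("rekognition")

with open("face.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request Gender among other attributes
    )

for face in response["FaceDetails"]:
    gender = face["Gender"]
    print(gender["Value"], gender["Confidence"])
```

A misclassification of the kind the researchers describe would show up here as a high-confidence "Male" prediction for an image of a woman.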


Researchers say Amazon face-detection technology shows bias

FOX News

The American Civil Liberties Union is also calling out Amazon's facial recognition tool after a test in which it compared photos of members of Congress to public arrest photos and turned up false matches. The report also recounts the MIT and University of Toronto finding that the facial-detection technology Amazon markets to law enforcement often misidentifies women, particularly those with darker skin, and that in tests it labeled darker-skinned women as men 31 percent of the time.


Amazon receives challenge from face recognition researcher over biased AI

USATODAY - Tech Top Stories

Facial recognition technology was already seeping into everyday life -- from your photos on Facebook to police scans of mugshots -- when Joy Buolamwini noticed a serious glitch: some of the software couldn't detect dark-skinned faces like hers, and at one point she had to hold up a white mask so that the software could detect her face at all. That revelation sparked the Massachusetts Institute of Technology researcher to launch a project that's having an outsize influence on the debate over how artificial intelligence should be deployed in the real world. Her research has uncovered racial and gender bias in facial analysis tools sold by companies such as Amazon, which have a hard time recognizing certain faces, especially those of darker-skinned women; her tests on software created by brand-name tech firms uncovered much higher error rates in classifying the gender of darker-skinned women than of lighter-skinned men.


Amazon facial-identification software used by police falls short on tests for accuracy and bias, new research finds

Washington Post - Technology News

Facial-recognition software developed by Amazon and marketed to local and federal law enforcement as a powerful crime-fighting tool struggles to pass basic tests of accuracy, such as correctly identifying a person's gender, new research released Thursday says. Researchers with the MIT Media Lab also said Amazon's Rekognition system performed more accurately when assessing lighter-skinned faces, raising concerns about how biased results could tarnish the artificial-intelligence technology's use by police and in public venues, including airports and schools. Amazon's system performed flawlessly in predicting the gender of lighter-skinned men, the researchers said, but misidentified the gender of darker-skinned women in roughly 30 percent of their tests. Rival facial-recognition systems from Microsoft and other companies performed better but were also error-prone, they said. The problem, AI researchers and engineers say, is that the vast sets of images the systems have been trained on skew heavily toward white men.
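The audit methodology behind these per-group error rates is straightforward in outline: run the gender classifier over a benchmark of labeled faces and compare misclassification rates across demographic subgroups. The sketch below is a minimal illustration of that idea, not the researchers' actual code; the classify_gender() placeholder stands in for whatever system is under test, and the sample records are invented purely for illustration.

```python
from collections import defaultdict

# Hypothetical benchmark records: (image_id, true_gender, skin_group).
# In the published study, the benchmark was balanced by gender and
# skin type; these few records are invented for illustration only.
BENCHMARK = [
    ("img001", "female", "darker"),
    ("img002", "male", "lighter"),
    # ... more labeled images ...
]

def classify_gender(image_id: str) -> str:
    """Placeholder for the system under test, e.g. a call to a
    cloud face-analysis API that returns a predicted gender."""
    raise NotImplementedError

def error_rates_by_subgroup(records):
    """Gender misclassification rate per (gender, skin_group) subgroup."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for image_id, true_gender, skin_group in records:
        key = (true_gender, skin_group)
        totals[key] += 1
        if classify_gender(image_id) != true_gender:
            errors[key] += 1
    return {key: errors[key] / totals[key] for key in totals}

# A result such as {("female", "darker"): 0.31, ("male", "lighter"): 0.0}
# would correspond to the disparities reported above.
```

Comparing error rates across subgroups, rather than reporting a single overall accuracy figure, is what surfaces the disparity: a system can score well on average while failing badly on one group.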


Amazon's facial recognition software mistakes women for men, especially darker-skinned women

Daily Mail - Science & tech

Amazon's controversial facial recognition software, Rekognition, is facing renewed criticism. A new study from the MIT Media Lab found that Rekognition may have gender and racial biases: in particular, the software performed worse when identifying the gender of women, especially darker-skinned women. When the software was presented with a number of female faces, it incorrectly labeled 19 percent of them as male.