Facial-recognition software developed by Amazon and marketed to local and federal law enforcement as a powerful crime-fighting tool struggles to pass basic tests of accuracy, such as correctly identifying a person's gender, according to new research released Thursday. Researchers at the M.I.T. Media Lab also said Amazon's Rekognition system performed more accurately when assessing lighter-skinned faces, raising concerns that biased results could tarnish the artificial-intelligence technology's use by police and in public venues, including airports and schools. Amazon's system performed flawlessly in predicting the gender of lighter-skinned men, the researchers said, but misidentified the gender of darker-skinned women in roughly 30 percent of their tests. Rival facial-recognition systems from Microsoft and other companies performed better but were also error-prone, the researchers said. The problem, AI researchers and engineers say, is that the vast sets of images on which the systems are trained skew heavily toward white men.
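Audits of this kind typically work by running the classifier over a benchmark labeled with ground-truth gender and skin type, then reporting the misclassification rate for each subgroup separately. A minimal sketch of that bookkeeping, using toy records that are purely illustrative (not the study's data):

```python
from collections import defaultdict

def error_rates_by_group(records):
    """records: (group, true_gender, predicted_gender) tuples.
    Returns each group's misclassification rate -- the kind of
    per-subgroup breakdown an audit like the MIT study reports."""
    errors = defaultdict(int)
    totals = defaultdict(int)
    for group, truth, prediction in records:
        totals[group] += 1
        if truth != prediction:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

# Hypothetical labeled results, for illustration only.
results = [
    ("lighter_male", "male", "male"),
    ("lighter_male", "male", "male"),
    ("darker_female", "female", "male"),    # one misclassification
    ("darker_female", "female", "female"),
    ("darker_female", "female", "female"),
]
print(error_rates_by_group(results))
```

Disaggregating by subgroup is the crucial step: an overall accuracy figure can look strong while masking a much higher error rate on one group, which is exactly the pattern the researchers describe.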
Amazon.com's facial-recognition tools incorrectly identified Rep. John Lewis (D-Ga.) and 27 other members of Congress as people arrested for a crime during a test commissioned by the American Civil Liberties Union of Northern California, the group said Thursday. The ACLU said its findings show that Amazon's Rekognition technology -- already in use at law-enforcement agencies in Oregon and Orlando -- is hampered by inaccuracies that disproportionately put people of color at risk and should prompt regulators to halt "law enforcement use of face surveillance." (Amazon chief executive Jeffrey P. Bezos owns The Washington Post.) For its test, the ACLU of Northern California created a database of 25,000 publicly available arrest photos, though the civil-liberties group did not give details about where it obtained the images or the kinds of individuals in the photos. It then used Amazon's Rekognition software to compare that database against photos of every member of the U.S. House and Senate.
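The headline rate implied by the test is easy to reconstruct: 28 false matches (Rep. Lewis plus 27 others) against the 535 members of the House and Senate works out to roughly a 5 percent false-match rate. A quick back-of-the-envelope check:

```python
# ACLU test arithmetic: 28 members of Congress falsely matched
# to arrest photos, out of 535 members scanned (435 House + 100 Senate).
false_matches = 28
members_scanned = 535

false_match_rate = false_matches / members_scanned
print(f"False-match rate: {false_match_rate:.1%}")  # prints "False-match rate: 5.2%"
```

That one-in-twenty figure, rather than the raw count, is what drives the ACLU's argument about deploying the system at scale.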
Amazon.com pitched its facial-recognition system in the summer to Immigration and Customs Enforcement officials as a way for the agency to target or identify immigrants, a move that could thrust the tech giant further into a growing debate over the industry's work with the government. The June meeting in Silicon Valley was revealed in emails obtained through a Freedom of Information Act request by the advocacy group Project on Government Oversight and first published by the Daily Beast. The emails show that officials from ICE and Amazon Web Services discussed implementing the company's Rekognition face-scanning platform to assist with homeland-security investigations. An Amazon Web Services official who specializes in federal sales contracts, and whose name was redacted in the emails, wrote that the conversation involved "predictive analytics" and "Rekognition Video tagging/analysis" that could possibly allow ICE to identify people's faces from afar -- a type of technology immigration officials have voiced interest in for its potential enforcement use on the southern border. "We are ready and willing to support the vital (Homeland Security Investigations) mission," the Amazon official wrote.
In Washington County, Oregon, sheriff's deputies use a mobile app to send photos of suspects to Amazon's cloud computing service. The e-commerce giant's algorithms check those faces against a database of tens of thousands of mugshots, using Amazon's Rekognition image analysis service. Such use of facial recognition by law enforcement is essentially unregulated. But some developers of the technology want to change that. In a blog post Thursday, Amazon asked Congress to put some rules around the use of the technology, echoing a call by Microsoft in December.
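Amazon has not published Rekognition's internals, but face-search systems of this kind typically reduce each face to an embedding vector and compare a probe photo against a gallery of mugshot embeddings, returning candidates above a similarity threshold. A minimal sketch of that general technique -- every name, vector, and threshold here is hypothetical, and this is not Amazon's API:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def search_gallery(probe, gallery, threshold=0.8):
    """Return (name, score) pairs for gallery faces whose similarity
    to the probe meets the threshold, best match first."""
    hits = [(name, cosine_similarity(probe, embedding))
            for name, embedding in gallery.items()]
    hits = [(name, score) for name, score in hits if score >= threshold]
    return sorted(hits, key=lambda hit: hit[1], reverse=True)

# Toy 3-dimensional "embeddings" standing in for real face vectors.
gallery = {
    "mugshot_001": [0.9, 0.1, 0.2],
    "mugshot_002": [0.1, 0.8, 0.5],
}
probe = [0.88, 0.12, 0.25]
print(search_gallery(probe, gallery))
```

The threshold is the policy-relevant knob: lowering it surfaces more candidate matches at the cost of more false positives, which is one reason critics argue for rules governing how such systems are configured in law-enforcement use.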
Amazon's controversial facial-recognition software, Rekognition, is facing renewed criticism. A new study from the MIT Media Lab found that Rekognition may have gender and racial biases: the software performed worse when identifying the gender of women, and worst of all for darker-skinned women. When the software was presented with a number of female faces, it incorrectly labeled 19 percent of them as male.