Amazon face recognition wrongly tagged lawmakers as police suspects, fueling racial bias concerns

FOX News

Amazon's Rekognition facial surveillance technology has wrongly tagged 28 members of Congress as police suspects, according to ACLU research, which notes that nearly 40 percent of the lawmakers identified by the system are people of color. In a blog post, Jacob Snow, technology and civil liberties attorney for the ACLU of Northern California, said that the false matches were made against a mugshot database and disproportionately involved people of color. They include six members of the Congressional Black Caucus, among them civil rights legend Rep. John Lewis, D-Ga.
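For context, the ACLU says it ran lawmakers' photos against a collection built from publicly available arrest photos using Rekognition's default settings. A minimal sketch of that kind of search with the boto3 Rekognition client might look like the following; the collection name, file path, and region are hypothetical, and the 80 percent threshold is Rekognition's documented default.

```python
# A minimal sketch (not the ACLU's actual code) of the kind of search
# described: probe a face collection built from mugshots with a single
# headshot. Collection name, file path, and region are hypothetical;
# FaceMatchThreshold=80 is Rekognition's documented default.
import boto3

rekognition = boto3.client("rekognition", region_name="us-west-2")

with open("lawmaker_headshot.jpg", "rb") as f:
    response = rekognition.search_faces_by_image(
        CollectionId="mugshot-collection",  # hypothetical, pre-indexed
        Image={"Bytes": f.read()},
        FaceMatchThreshold=80,              # default match confidence
        MaxFaces=5,
    )

# Each match carries a similarity score; at the default threshold,
# anything at or above 80 percent is reported as a hit.
for match in response["FaceMatches"]:
    print(match["Face"]["FaceId"], round(match["Similarity"], 1))
```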


IBM Facial Recognition Dataset Aims to Tackle Gender, Racial Bias

#artificialintelligence

"We expect face recognition to work accurately for each of us" IBM is releasing a new dataset called Diversity in Faces in the hope that it will help developers tackle gender and skin type biases in facial recognition software. The dataset of one million images has been compiled using publicly available images taken from the YFCC-100M Creative Commons dataset. These images were then annotated using ten facial coding schemes, along with human-labelled gender and age notes. The development of facial recognition software has been rocky: from its use in player creation in NBA and FIFA videogames that resulted in Cronenbergesque facial models, to the gender and skin type bias experienced in modern facial-analysis programs. Last year MIT researchers found that facial-analysis software had an error rate of 34.7 percent for dark-skinned women.


San Francisco Approves Ban On Government's Use Of Facial Recognition Technology

NPR Technology

In this Oct. 31 photo, a man has his face painted to represent efforts to defeat facial recognition during a protest at Amazon headquarters over the company's facial recognition system. San Francisco has become the first U.S. city to ban the use of facial recognition technology by police and city agencies.


AI facial analysis demonstrates both racial and gender bias

Engadget

Researchers from MIT and Stanford University found that three different facial analysis programs demonstrate both gender and skin color biases. The full paper will be presented at the Conference on Fairness, Accountability, and Transparency later this month.
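The methodology behind findings like these is disaggregated evaluation: rather than reporting one aggregate accuracy figure, error rates are computed separately for each gender and skin-type subgroup. A small sketch of that idea, using synthetic placeholder records rather than the study's benchmark data:

```python
# Disaggregated error rates per intersectional subgroup. The records
# below are synthetic placeholders, not data from the MIT/Stanford study.
import pandas as pd

records = pd.DataFrame({
    "gender":    ["female", "female", "male", "male", "female", "male"],
    "skin_type": ["darker", "lighter", "darker", "lighter", "darker", "lighter"],
    "correct":   [False, True, True, True, False, True],
})

# A single overall average would hide the gap between subgroups.
error_rate = 1 - records.groupby(["gender", "skin_type"])["correct"].mean()
print(error_rate)
print("worst-to-best gap:", error_rate.max() - error_rate.min())
```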


Microsoft says its facial recognition technology is less biased

Mashable

Microsoft claims its facial recognition technology just got a little less awful. Earlier this year, a study by MIT researchers found that tools from IBM, Microsoft, and Chinese company Megvii could correctly identify light-skinned men with 99-percent accuracy, but incorrectly identified darker-skinned women as often as one-third of the time. Now imagine a computer incorrectly flagging an image at an airport or in a police database, and you can see how dangerous those errors could be. Microsoft's software was among those that performed poorly in the study.