Facial-recognition technology works best if you're a white guy, study says

#artificialintelligence

Facial-recognition technology is improving by leaps and bounds. Some commercial software can now tell the gender of a person in a photograph. When the person in the photo is a white man, the software is right 99 percent of the time. But the darker the skin, the more errors arise -- up to nearly 35 percent for images of darker-skinned women, according to a new study that breaks fresh ground by measuring how the technology works on people of different races and gender. These disparate results, calculated by Joy Buolamwini, a researcher at the Massachusetts Institute of Technology Media Lab, show how some of the biases in the real world can seep into artificial intelligence, the computer systems that inform facial recognition.


Amazon's face-detection technology for police shows bias, researchers say

The Japan Times

NEW YORK - Facial-detection technology that Amazon is marketing to law enforcement often misidentifies women, particularly those with darker skin, according to researchers from MIT and the University of Toronto. Privacy and civil rights advocates have called on Amazon to stop marketing its Rekognition service because of worries about discrimination against minorities. Some Amazon investors have also asked the company to stop out of fear that it makes Amazon vulnerable to lawsuits. The researchers said that in their tests, Amazon's technology labeled darker-skinned women as men 31 percent of the time. Lighter-skinned women were misidentified 7 percent of the time.


Facial recognition technology is finally more accurate in identifying people of color. Could that be used against immigrants?

Washington Post - Technology News

Microsoft this week announced its facial-recognition system is now more accurate in identifying people of color, touting its progress at tackling one of the technology's biggest biases. But critics, citing Microsoft's work with Immigration and Customs Enforcement, quickly seized on how that improved technology might be used. The agency contracts with Microsoft for a set of cloud-computing tools that the tech giant says is largely limited to office work, but which can also include face recognition. Columbia University professor Alondra Nelson tweeted, "We must stop confusing 'inclusion' in more 'diverse' surveillance systems with justice and equality." Today's facial-recognition systems more often misidentify people of color because of a long-running data problem: The massive sets of facial images they train on skew heavily toward white men.


Researchers say Amazon face-detection technology shows bias

FOX News

The American Civil Liberties Union is calling out Amazon's facial recognition tool after comparing photos of members of Congress to public arrest photos. Facial-detection technology that Amazon is marketing to law enforcement often misidentifies women, particularly those with darker skin, according to researchers from MIT and the University of Toronto. Privacy and civil rights advocates have called on Amazon to stop marketing its Rekognition service because of worries about discrimination against minorities. Some Amazon investors have also asked the company to stop out of fear that it makes Amazon vulnerable to lawsuits. The researchers said that in their tests, Amazon's technology labeled darker-skinned women as men 31 percent of the time.


Microsoft tweaks facial-recognition tech to combat bias

FOX News

Microsoft's facial-recognition technology is getting smarter at recognizing people with darker skin tones. On Tuesday, the company touted the progress, though it comes amid growing worries that these technologies will enable surveillance against people of color. Microsoft's announcement didn't broach the concerns; the company merely addressed how its facial-recognition tech could misidentify both men and women with darker skin tones. Microsoft has recently reduced the system's error rates by up to 20 times. In February, research from MIT and Stanford University highlighted how facial-recognition technologies can be built with bias.