Facial recognition software is biased towards white men, researcher finds

#artificialintelligence

New research out of MIT's Media Lab underscores what other experts have reported or at least suspected before: facial recognition technology is subject to biases based on the data sets it is trained on and the conditions under which its algorithms are created. Joy Buolamwini, a researcher at the MIT Media Lab, recently built a dataset of 1,270 faces, using the faces of politicians selected based on their countries' rankings for gender parity (in other words, having a significant number of women in public office). Buolamwini then tested the accuracy of three facial recognition systems: those made by Microsoft, IBM, and Megvii of China. The results, originally reported in The New York Times, showed that the accuracy of gender identification depended on a person's skin color. Gender was misidentified in less than one percent of lighter-skinned males; in up to seven percent of lighter-skinned females; in up to 12 percent of darker-skinned males; and in up to 35 percent of darker-skinned females.
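The disparity Buolamwini measured reduces to a simple per-subgroup error-rate computation: group each test face by skin tone and gender, then count how often the system's predicted gender disagrees with the true label. Below is a minimal Python sketch of that bookkeeping; the records, subgroup names, and counts are hypothetical placeholders, not her actual benchmark data.

```python
from collections import defaultdict

# Minimal sketch of a per-subgroup error-rate audit, in the spirit of the
# study described above. All records below are made-up placeholders, not
# Buolamwini's actual benchmark data.
records = [
    # (subgroup, true_gender, predicted_gender)
    ("lighter_male",   "male",   "male"),
    ("lighter_male",   "male",   "male"),
    ("lighter_female", "female", "female"),
    ("lighter_female", "female", "male"),    # misclassified
    ("darker_male",    "male",   "male"),
    ("darker_male",    "male",   "female"),  # misclassified
    ("darker_female",  "female", "male"),    # misclassified
    ("darker_female",  "female", "male"),    # misclassified
]

totals = defaultdict(int)
errors = defaultdict(int)
for subgroup, truth, predicted in records:
    totals[subgroup] += 1
    if predicted != truth:
        errors[subgroup] += 1

# Report the misclassification rate for each subgroup; the disparity is
# the gap between the best- and worst-served groups.
for subgroup in sorted(totals):
    rate = errors[subgroup] / totals[subgroup]
    print(f"{subgroup}: {errors[subgroup]}/{totals[subgroup]} "
          f"misclassified ({rate:.0%})")
```

On this toy data the gap is exaggerated (0 percent for lighter-skinned males, 100 percent for darker-skinned females), but it illustrates in miniature the kind of subgroup disparity the study reported.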


Meet the computer scientist and activist who got Big Tech to stand down

#artificialintelligence

Today, Buolamwini is galvanizing a growing movement to expose the social consequences of artificial intelligence. Through her nearly four-year-old nonprofit, the Algorithmic Justice League (AJL), she has testified before lawmakers at the federal, state, and local levels about the dangers of using facial recognition technologies with no oversight of how they're created or deployed. Since George Floyd's death, she has called for a complete halt to police use of face surveillance, and she is providing activists with resources and tools to demand regulation. Many companies, such as Clearview AI, are still selling facial analysis to police and government agencies. And many police departments are using facial recognition technologies to identify, in the words of the New York Police Department, individuals "that have committed, are committing, or are about to commit crimes."


Amazon Announces 1-Year Moratorium On Police Use Of Its Facial-Recognition Technology

NPR Technology

When Amazon introduced its facial recognition technology to law enforcement agencies, the tech giant likened it to magic. Now Amazon is telling police: stop using it for the next year. It's a move that signals the impact that protests over police brutality are having on the tech industry. And we should note that Amazon is a financial supporter of NPR. BOBBY ALLYN, BYLINE: For years, Amazon has offered a service called Rekognition to police departments.


Joy Buolamwini: How Does Facial Recognition Software See Skin Color?

NPR Technology

Facial analysis technology is often unable to recognize dark skin tones. Joy Buolamwini says this bias can lead to detrimental results, and she urges her colleagues to create more inclusive code. As a "poet of code," computer scientist Joy Buolamwini founded the Algorithmic Justice League to fight inequality in computation. Her graduate research at the MIT Media Lab focuses on algorithmic and coded bias in machine learning. Buolamwini is a Fulbright Fellow, an Astronaut Scholar, a Rhodes Scholar, and a Google Anita Borg Scholar.