When Bias Is Coded Into Our Technology

NPR Technology 

Facial recognition systems from large tech companies often incorrectly classify Black women as male -- including the likes of Michelle Obama, Serena Williams and Sojourner Truth. That's according to Joy Buolamwini, whose research caught wide attention in 2018 with "AI, Ain't I a Woman?", a spoken-word piece based on her findings at the MIT Media Lab. The video, along with the accompanying research paper written with Timnit Gebru of Microsoft Research, prompted many tech companies to reassess how their facial recognition data sets and algorithms perform on darker-skinned and female faces.

"Coded Bias," a documentary directed by Shalini Kantayya that premiered at the Sundance Film Festival in late January, interweaves Buolamwini's journey of founding the Algorithmic Justice League, an advocacy organization, with other examples of facial recognition software being rolled out around the world -- on the streets of London, in housing projects in Brooklyn and broadly across China.

Jennifer 8. Lee, a journalist and documentary producer, caught up with Joy Buolamwini and Shalini Kantayya in Park City, Utah, after the premiere of "Coded Bias."
