Humans Have the Power to Decode Bias in AI

#artificialintelligence 

Algorithms make decisions for humans every day. Some decide who gets the COVID-19 vaccine first, while others determine which candidate gets a job or which person receives undue police scrutiny. Yet these same systems have not been vetted for bias or discrimination, nor are they held to standards for accuracy. A discovery by MIT Media Lab researcher Joy Buolamwini revealed that facial recognition technology fails to see dark-skinned faces accurately. That finding inspired Coded Bias, a 90-minute documentary created by director/producer Shalini Kantayya.
