Humans Have the Power to Decode Bias in AI
Algorithms make decisions for humans every day. Some decide who gets the COVID-19 vaccine first, while others determine which candidate gets a job or which person gets undue police scrutiny. Yet many of these systems have not been vetted for bias or discrimination, nor are they held to standards for accuracy. MIT Media Lab researcher Joy Buolamwini discovered that facial recognition technology does not see dark-skinned faces accurately. That finding inspired Coded Bias, a 90-minute documentary by director/producer Shalini Kantayya.
Feb-27-2021, 10:05:26 GMT