Artificial Intelligence Has a Problem With Gender and Racial Bias
I experienced this firsthand when, as a graduate student at MIT in 2015, I discovered that some facial analysis software couldn't detect my dark-skinned face until I put on a white mask. These systems are often trained on images of predominantly light-skinned men. And so I decided to share my experience of the coded gaze, the bias in artificial intelligence that can lead to discriminatory or exclusionary practices.

Altering myself to fit the norm (in this case, a norm better represented by a white mask than my actual face) led me to recognize the impact of the exclusion overhead, a term I coined to describe the cost of systems that don't take into account the diversity of humanity. How much does a person have to change themselves to function with technological systems that increasingly govern our lives? We often assume machines are neutral, but they aren't.
February 7, 2019