Google explains how artificial intelligence becomes biased against women and minorities


Time and again, research has shown that the machines we build reflect how we see the world, whether consciously or not. For artificial intelligence that reads text, that might mean associating the word "doctor" with men more often than with women; for image-recognition algorithms, it has meant misclassifying black people as gorillas. Google, which was responsible for that gorilla error in 2015, is now trying to educate the public on how AI can accidentally perpetuate the biases held by its makers. As an example, Google asked users to draw a shoe. Most users drew a men's-style shoe, so a system trained on those drawings failed to recognize high heels as shoes.
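The "doctor"-and-men association the article mentions is typically measured by comparing word-embedding similarities. The sketch below is a minimal, hypothetical illustration: the toy vectors are made up for demonstration and are not drawn from any real trained model, but the cosine-similarity comparison is the standard way such associations are quantified.

```python
import math

# Toy 3-dimensional "word embeddings". These values are hypothetical,
# chosen only to illustrate the measurement, not taken from a real model.
vectors = {
    "he":     [0.90, 0.10, 0.00],
    "she":    [0.10, 0.90, 0.00],
    "doctor": [0.80, 0.30, 0.20],  # deliberately skewed toward "he"
    "nurse":  [0.20, 0.85, 0.10],  # deliberately skewed toward "she"
}

def cosine(a, b):
    """Cosine similarity: 1.0 means same direction, 0.0 means unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def gender_bias(word):
    """Positive when the word sits closer to 'he' than to 'she'."""
    return cosine(vectors[word], vectors["he"]) - cosine(vectors[word], vectors["she"])

print(f"doctor bias: {gender_bias('doctor'):+.3f}")  # positive -> leans male
print(f"nurse bias:  {gender_bias('nurse'):+.3f}")   # negative -> leans female
```

In a real system the vectors come from training on large text corpora, so the skew reflects the training data rather than a deliberate choice, which is exactly how unintended associations creep in.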
