AIs that learn from photos become sexist
Image-recognition AIs trained on some of the most widely used research photo collections are developing sexist biases, according to a new study. University of Virginia computer science professor Vicente Ordóñez and colleagues tested two of the largest collections of photos and labels used to train such AIs (including one supported by Facebook and Microsoft) and found that sexist associations were rampant. Ordóñez began the research after noticing a disturbing pattern of sexism in the guesses made by the image-recognition software he was building. 'It would see a picture of a kitchen and more often than not associate it with women, not men,' Ordóñez told Wired, adding that it also linked women with images of shopping and washing, and even with kitchen objects like forks. The AI likewise associated men with stereotypically masculine activities such as sports, hunting, and coaching, as well as with objects such as sporting equipment.
Aug-22-2017, 00:50:06 GMT