AI Bias: When Algorithms Go Bad
Earlier this month, researchers from the Massachusetts Institute of Technology and Stanford University reported that three commercial facial-analysis programs from major technology companies showed bias along both skin-type and gender lines. Error rates for determining the gender of light-skinned men were as low as 0.8 percent, while error rates for darker-skinned women were far higher -- reaching 20 percent in one case and more than 34 percent in the other two.

This is not the first time an algorithm powering an AI application has delivered an erroneous -- to say nothing of embarrassing -- result. In 2015, Flickr, a photo-sharing site owned by Yahoo, launched image-recognition software that automatically generated tags for photos. Some of the tags were highly offensive, such as "sport" and "jungle gym" for pictures of concentration camps and "ape" for pictures of humans, including an African American man.
Mar-4-2018, 13:46:13 GMT