Amazon's facial recognition software mistakes women for men, especially darker-skinned women
Amazon's controversial facial recognition software, Rekognition, is facing renewed criticism. A new study from the MIT Media Lab found that Rekognition may have gender and racial biases: the software performed worse when identifying the gender of women, and worse still for darker-skinned women. When presented with a set of female faces, it incorrectly labeled 19 percent of them as male.
Jan-25-2019, 23:49:44 GMT