Amazon needs to come clean about racial bias in its algorithms
Yesterday, Amazon's quiet Rekognition program became very public, as new documents obtained by the ACLU of Northern California showed the company partnering with the city of Orlando and police camera vendors like Motorola Solutions on an aggressive new real-time facial recognition service. Amazon insists that Rekognition is a simple object-recognition tool and will only be used for legal purposes. But even if we take the company at its word, the project raises serious concerns, particularly around racial bias. Facial recognition systems have long struggled with higher error rates for women and people of color -- error rates that can translate directly into more stops and arrests for marginalized groups. And while some companies have responded with public bias testing, Amazon hasn't shared any data on the issue, if it has collected such data at all.