'I think my blackness is interfering': does facial recognition show racial bias?


Cameras are used routinely by police across the US to identify citizens, their faces cross-matched against databases of suspects and past criminals. Yet researchers claim there is too little scrutiny of how these tools work, and have found inherent racial bias in the systems. So does a sophisticated visual analysis tool reflect human prejudice, and if so, whom does it affect? "Studies indicate there's racial bias in the software," said Jonathan Frankle, staff technologist at Georgetown Law School. Working with law fellow Clare Garvie, Frankle has requested public information from more than 100 police departments across the country.
