Amazon face recognition falsely matches 28 lawmakers with mugshots, ACLU says

The Guardian

Amazon's facial recognition technology falsely identified 28 members of Congress as people who have been arrested for crimes, according to the American Civil Liberties Union (ACLU). The ACLU of Northern California's test of Amazon's controversial Rekognition software also found that people of color were disproportionately misidentified in matches against a mugshot database, raising new concerns about racial bias and the potential for abuse by law enforcement. The report followed revelations in May that Amazon has been marketing and selling the Rekognition technology to police agencies, leading privacy advocates to urge CEO Jeff Bezos to stop providing the product to the government. "Our test reinforces that face surveillance is not safe for government use," Jacob Snow, a technology and civil liberties attorney at the ACLU Foundation of Northern California, said in a statement. "Face surveillance will be used to power discriminatory surveillance and policing that targets communities of color, immigrants, and activists."


Amazon's Facial Recognition System Mistakes Members of Congress for Mugshots

WIRED

Amazon touts its Rekognition facial recognition system as "simple and easy to use," encouraging customers to "detect, analyze, and compare faces for a wide variety of user verification, people counting, and public safety use cases." And yet, in a study released Thursday by the American Civil Liberties Union, the technology managed to confuse photos of 28 members of Congress with publicly available mug shots. Given that Amazon actively markets Rekognition to law enforcement agencies across the US, that's simply not good enough. The ACLU study also illustrated the racial bias that plagues facial recognition today. "Nearly 40 percent of Rekognition's false matches in our test were of people of color, even though they make up only 20 percent of Congress," wrote ACLU attorney Jacob Snow.


Facial recognition tech sucks, but it's inevitable

#artificialintelligence

These are just some of the questions being raised by lawmakers, civil libertarians, and privacy advocates in the wake of an ACLU report released last summer that claimed Amazon's facial recognition software, Rekognition, misidentified 28 members of Congress as criminals. Rekognition is a general-purpose application programming interface (API) that developers can use to build applications that detect and analyze scenes, objects, faces, and other items within images. The source of the controversy was a pilot program in which Amazon teamed up with law enforcement agencies in two jurisdictions, Orlando, Florida, and Washington County, Oregon, to explore the use of facial recognition in policing. In January 2019, the Daily Mail reported that the FBI had been testing Rekognition since early 2018. The Project on Government Oversight also revealed, via a Freedom of Information Act request, that Amazon had pitched Rekognition to ICE in June 2018.
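For readers unfamiliar with what "general-purpose API" means in practice, here is a minimal sketch, using the public boto3 client, of the kind of face-matching workflow described in the coverage: indexing reference photos into a face collection and then searching a probe photo against it. The collection name and file paths are hypothetical placeholders, and the 80% threshold reflects the default setting the ACLU said it used in its test; this is an illustrative sketch, not the ACLU's actual test harness.

```python
# Illustrative sketch of a Rekognition face-search workflow with boto3.
# Collection ID, region, and file paths are placeholder assumptions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-west-2")

COLLECTION_ID = "mugshot-collection"  # hypothetical collection name


def create_collection():
    """Create the face collection once before indexing any images."""
    rekognition.create_collection(CollectionId=COLLECTION_ID)


def index_reference_photo(image_path, person_id):
    """Add one reference photo (e.g., a mugshot) to the collection, tagged with an ID."""
    with open(image_path, "rb") as f:
        rekognition.index_faces(
            CollectionId=COLLECTION_ID,
            Image={"Bytes": f.read()},
            ExternalImageId=person_id,
        )


def search_for_match(image_path, threshold=80):
    """Search a probe photo against the collection.

    FaceMatchThreshold=80 is Rekognition's default similarity cutoff,
    which is the setting the ACLU reported using in its test.
    """
    with open(image_path, "rb") as f:
        response = rekognition.search_faces_by_image(
            CollectionId=COLLECTION_ID,
            Image={"Bytes": f.read()},
            FaceMatchThreshold=threshold,
            MaxFaces=1,
        )
    # Empty list means no face in the collection matched above the threshold.
    return response["FaceMatches"]
```

A match returned at the default threshold is only a similarity score, not an identification, which is why the choice of threshold has been central to the dispute between Amazon and the ACLU over how the test results should be interpreted.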


Amazon is under fire for selling facial recognition tools to cops

Mashable

Amazon has some explaining to do. The online retail giant has been caught providing facial recognition technology to law enforcement in Oregon and Orlando, according to documents obtained by the American Civil Liberties Union through a Freedom of Information Act request. Emails obtained through the request show how Amazon has been advertising and selling its facial recognition product, Rekognition, to law enforcement agencies for only a few dollars a month, in the hope that they would encourage other agencies to sign up. The emails also show that Amazon has marketed consulting services to law enforcement.


Amazon selling facial recognition software to police, records reveal

The Guardian

In the aftermath of the uprising in Ferguson, Missouri, over the killing of Michael Brown, police departments and policymakers around the country hit upon a supposed panacea for racist policing and police brutality: body-worn cameras. Many hailed the move as a victory for accountability. But among the few dissenters was Malkia Cyril, executive director of the Center for Media Justice and a leader in the Black Lives Matter network, who warned early and often that the cameras could become tools of surveillance against people of color because "body-worn cameras don't watch the police, they watch the community being policed, people like me". The scope and scale of that surveillance became clearer Tuesday, when the American Civil Liberties Union of Northern California released a collection of public records detailing how Amazon has been marketing and selling facial recognition software, called Amazon Rekognition, to law enforcement agencies. Amazon marketing materials promoted the idea of using Rekognition in conjunction with police body cameras in real time – exactly the outcome Cyril feared.