Despite what you may think, face recognition surveillance isn't inevitable

#artificialintelligence

Last year, communities banded together to prove that they can--and will--defend their privacy rights. As part of ACLU-led campaigns, three California cities--San Francisco, Berkeley, and Oakland--as well as three Massachusetts municipalities--Somerville, Northampton, and Brookline--banned government use of face recognition in their communities. Following another ACLU effort, the state of California blocked police body cam use of the technology, forcing San Diego's police department to shutter its massive face surveillance flop. And in New York City, tenants successfully fended off their landlord's efforts to install face surveillance. Even the private sector demonstrated it had a responsibility to act in the face of the growing threat of face surveillance.


Can we trust tech giants with our faces? Google, Amazon and Microsoft can't agree on how to protect us

USATODAY - Tech Top Stories

A top Google executive recently sent a shot across the bow of its competitors regarding face surveillance. Kent Walker, the company's general counsel and senior vice president of global affairs, made it clear that Google -- unlike Amazon and Microsoft -- will not sell a face recognition product until the technology's potential for abuse is addressed. Face recognition, powered by artificial intelligence, could allow the government to supercharge surveillance by automating identification and tracking. Authorities could use it to track protesters, target vulnerable communities (such as immigrants), and intensify digital policing of communities of color that are already subject to pervasive police monitoring. So how are the world's biggest technology companies responding to this serious threat to privacy, safety and civil rights?


Amazon investors press company to stop selling 'racially biased' surveillance tech to government agencies

FOX News

Why the American Civil Liberties Union is calling out Amazon's facial recognition tool, and what the ACLU found when it compared photos of members of Congress to public arrest photos. A group of Amazon shareholders is pushing the tech giant to stop selling its controversial facial recognition technology to U.S. government agencies, just days after a coalition of 85 human rights, faith, and racial justice groups demanded in an open letter that Jeff Bezos' company stop marketing surveillance technology to the feds. Over the last year, the "Rekognition" technology, which has reportedly been marketed to U.S. Immigration and Customs Enforcement (ICE), has come under fire from immigrants' rights groups and privacy advocates who argue that it can be misused and ultimately lead to racially biased outcomes. A test of the technology by the American Civil Liberties Union (ACLU) showed that 28 members of Congress, mostly people of color, were incorrectly identified as police suspects. According to media reports and the ACLU, Amazon has already sold or marketed "Rekognition" to law enforcement agencies in three states.


Amazon's Face Recognition Falsely Matched 28 Members of Congress With Mugshots

#artificialintelligence

Amazon's face surveillance technology is the target of growing opposition nationwide, and today, there are 28 more causes for concern. In a test the ACLU recently conducted of the facial recognition tool, called "Rekognition," the software incorrectly matched 28 members of Congress, identifying them as other people who have been arrested for a crime. The members of Congress who were falsely matched with the mugshot database we used in the test include Republicans and Democrats, men and women, and legislators of all ages, from all across the country. Our test used Amazon Rekognition to compare images of members of Congress with a database of mugshots. The results included 28 incorrect matches.


Amazon's Facial Recognition Tool Falsely Matched 28 Members of Congress to Mug Shots

Slate

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society. The ACLU released a report on Thursday revealing that Rekognition, Amazon's facial recognition tool, had falsely matched 28 members of Congress to mug shots. Members of the ACLU purchased the version of Rekognition that Amazon offers to the general public and ran public photos of every member of the House and Senate against a database of 25,000 arrest photos. The entire experiment cost $12.33, which, as ACLU attorney Jake Snow writes in a blog post, is "less than a large pizza." Almost 40 percent of the lawmakers Rekognition falsely matched were people of color, even though people of color make up only 20 percent of Congress.
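
To make the mechanics of such a test concrete, below is a minimal sketch of how photos could be matched against a mugshot collection using Amazon Rekognition's publicly available API via the boto3 SDK. The collection name, file paths, region, and the 80 percent similarity threshold are illustrative assumptions, not the ACLU's actual test code or settings.

import boto3

# Hypothetical sketch: collection name, region, file paths, and threshold
# are assumptions for illustration only.
rekognition = boto3.client("rekognition", region_name="us-west-2")
COLLECTION_ID = "arrest-photos"  # assumed collection pre-populated with mugshots

def index_mugshot(path, external_id):
    # Add one arrest photo to the collection (run once per mugshot).
    with open(path, "rb") as f:
        rekognition.index_faces(
            CollectionId=COLLECTION_ID,
            Image={"Bytes": f.read()},
            ExternalImageId=external_id,
        )

def search_for_matches(path, threshold=80.0):
    # Search the collection for faces similar to the face in the given photo.
    # 80 is an assumed similarity threshold, matching the service default.
    with open(path, "rb") as f:
        response = rekognition.search_faces_by_image(
            CollectionId=COLLECTION_ID,
            Image={"Bytes": f.read()},
            FaceMatchThreshold=threshold,
            MaxFaces=5,
        )
    return response.get("FaceMatches", [])

# Any returned match is counted as a hit, whether or not it is correct.
for match in search_for_matches("member_of_congress.jpg"):
    print(match["Face"]["ExternalImageId"], match["Similarity"])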


Lawmakers need to curb face recognition searches by police

Los Angeles Times

When is it appropriate for police to conduct a face recognition search? To figure out who's who in a crowd of protesters? To monitor foot traffic in a high-crime neighborhood? To confirm the identity of a suspect -- or a witness -- caught on tape? According to a new report by Georgetown Law's Center on Privacy & Technology, these are questions very few police departments asked before widely deploying face recognition systems.