Abolish the #TechToPrisonPipeline

#artificialintelligence

The authors of the Harrisburg University study make explicit their desire to provide "a significant advantage for law enforcement agencies and other intelligence agencies to prevent crime" as a co-author and former NYPD police officer outlined in the original press release.[38] At a time when the legitimacy of the carceral state, and policing in particular, is being challenged on fundamental grounds in the United States, there is high demand in law enforcement for research of this nature, research which erases historical violence and manufactures fear through the so-called prediction of criminality. Publishers and funding agencies serve a crucial role in feeding this ravenous maw by providing platforms and incentives for such research. The circulation of this work by a major publisher like Springer would represent a significant step towards the legitimation and application of repeatedly debunked, socially harmful research in the real world. To reiterate our demands, the review committee must publicly rescind the offer for publication of this specific study, along with an explanation of the criteria used to evaluate it. Springer must issue a statement condemning the use of criminal justice statistics to predict criminality and acknowledging their role in incentivizing such harmful scholarship in the past. Finally, all publishers must refrain from publishing similar studies in the future.


Despite what you may think, face recognition surveillance isn't inevitable

#artificialintelligence

Last year, communities banded together to prove that they can--and will--defend their privacy rights. As part of ACLU-led campaigns, three California cities--San Francisco, Berkeley, and Oakland--as well as three Massachusetts municipalities--Somerville, Northampton, and Brookline--banned government use of face recognition in their communities. Following another ACLU effort, the state of California blocked police body cam use of the technology, forcing San Diego's police department to shutter its massive face surveillance flop. And in New York City, tenants successfully fended off their landlord's efforts to install face surveillance. Even the private sector demonstrated it had a responsibility to act in the face of the growing threat of face surveillance.


Why facial recognition's racial bias problem is so hard to crack

#artificialintelligence

Jimmy Gomez is a California Democrat, a Harvard graduate and one of the few Hispanic lawmakers serving in the US House of Representatives. But to Amazon's facial recognition system, he looks like a potential criminal. Gomez was one of 28 US Congress members falsely matched with mugshots of people who've been arrested, in a test of the Amazon Rekognition program that the American Civil Liberties Union ran last year. Nearly 40 percent of the false matches by Amazon's tool, which is being used by police, involved people of color. This is part of a CNET special report exploring the benefits and pitfalls of facial recognition.


Making face recognition less biased doesn't make it less scary

MIT Technology Review

In the past few years, there's been a dramatic rise in the adoption of face recognition, detection, and analysis technology. You're probably most familiar with recognition systems, like Facebook's photo-tagging recommender and Apple's Face ID, which can identify specific individuals. Detection systems, on the other hand, determine whether a face is present at all; and analysis systems try to identify aspects like gender and race. All of these systems are now being used for a variety of purposes, from hiring and retail to security and surveillance. Many people believe that such systems are both highly accurate and impartial.


Amazon investors press company to stop selling 'racially biased' surveillance tech to government agencies

FOX News

A group of Amazon shareholders is pushing the tech giant to stop selling its controversial facial recognition technology to U.S. government agencies, just days after a coalition of 85 human rights, faith, and racial justice groups demanded in an open letter that Jeff Bezos' company stop marketing surveillance technology to the feds. Over the last year, the "Rekognition" technology, which has reportedly been marketed to U.S. Immigration and Customs Enforcement (ICE), has come under fire from immigrants' rights groups and privacy advocates who argue that it can be misused and ultimately lead to racially biased outcomes. A test of the technology by the American Civil Liberties Union (ACLU) showed that 28 members of Congress, a disproportionate number of them people of color, were incorrectly identified as police suspects. According to media reports and the ACLU, Amazon has already sold or marketed "Rekognition" to law enforcement agencies in three states.


Amazon's facial recognition tool misidentified 28 members of Congress in ACLU test

USATODAY - Tech Top Stories

SAN FRANCISCO -- Amazon's controversial facial recognition program, Rekognition, falsely identified 28 members of Congress during a test of the program by the American Civil Liberties Union, the civil rights group said Thursday. In its test, the ACLU scanned photos of all members of Congress and had the system compare them with a public database of 25,000 mugshots. The group used the default "confidence threshold" setting of 80 percent for Rekognition, meaning the test counted a face match at 80 percent certainty or more. At that setting, the system misidentified 28 members of Congress, a disproportionate number of whom were people of color, tagging them instead as entirely different people who have been arrested for a crime. The faces of members of Congress used in the test included Republicans and Democrats, men and women, and legislators of all ages.
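
For readers wondering where that "confidence threshold" lives: in Rekognition's face-search API it is simply a parameter on the call, and any candidate scoring at or above it is reported as a match. Below is a minimal sketch of that kind of query using the boto3 Python client; the "mugshots" collection name and the photo path are hypothetical placeholders, and only the FaceMatchThreshold value of 80 reflects the default setting described above.

# Hedged sketch: a face-search call at an 80% confidence threshold against
# a pre-built Rekognition face collection. The "mugshots" collection ID and
# the photo path are hypothetical; only the threshold value comes from the
# article's description of the ACLU test.
import boto3

rekognition = boto3.client("rekognition")

def search_for_matches(photo_path, collection_id="mugshots"):
    """Search a face collection for matches to a single photo.

    Returns every match scoring at or above the 80% threshold --
    the Rekognition default the ACLU test reportedly used.
    """
    with open(photo_path, "rb") as f:
        image_bytes = f.read()

    response = rekognition.search_faces_by_image(
        CollectionId=collection_id,   # collection previously populated via index_faces
        Image={"Bytes": image_bytes},
        FaceMatchThreshold=80,        # a match is any face scoring >= 80% similarity
        MaxFaces=5,                   # cap the number of candidates returned
    )
    return response["FaceMatches"]    # each entry carries a Similarity score

Raising or lowering FaceMatchThreshold directly changes how many matches -- including false ones -- the system reports, which is why the choice of default setting matters so much in practice.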


Lawmakers need to curb face recognition searches by police

Los Angeles Times

When is it appropriate for police to conduct a face recognition search? To figure out who's who in a crowd of protesters? To monitor foot traffic in a high-crime neighborhood? To confirm the identity of a suspect -- or a witness -- caught on tape? According to a new report by Georgetown Law's Center on Privacy & Technology, these are questions very few police departments asked before widely deploying face recognition systems.