

Ed Markey, Ayanna Pressley push for federal ban on facial recognition technology

Boston Herald

Massachusetts Sen. Ed Markey and Rep. Ayanna Pressley are pushing to ban the federal government's use of facial recognition technology, as Boston last week nixed city use of the technology and tech giants paused their sale of facial surveillance tools to police. The momentum to stop government use of facial recognition technology comes in the wake of the police killing of George Floyd in Minneapolis -- a Black man killed by a white police officer. Floyd's death has sparked nationwide protests for racial justice and triggered calls for police reform, including scrutiny of the ways police track people. Facial recognition technology contributes to the "systemic racism that has defined our society," Markey said on Sunday. "We cannot ignore that facial recognition technology is yet another tool in the hands of law enforcement to profile and oppress people of color in our country," Markey said during an online press briefing.


The little-known AI firms whose facial recognition tech led to a false arrest

#artificialintelligence

Robert Williams went to jail because a computer--and a pair of Detroit police officers--made a mistake. The officers relied on facial recognition software to identify Williams as a suspect in a 15-month-old shoplifting case. They were wrong--making Williams perhaps the first person known to be wrongfully arrested as a result of faulty facial recognition. Earlier this month, IBM, Microsoft, and Amazon swore off or paused their sale of facial recognition tools to US police and called on Congress to regulate the technology. The software that misidentified Williams was sold by police contractor DataWorks Plus and powered by algorithms from Japanese tech firm NEC and Colorado-based Rank One Computing.


Detroit police challenged over face recognition flaws, bias

#artificialintelligence

A Black man who was wrongfully arrested when facial recognition technology mistakenly identified him as a suspected shoplifter wants Detroit police to apologize -- and to end their use of the controversial technology. The complaint by Robert Williams is a rare challenge from someone who not only experienced an erroneous face recognition hit, but was able to discover that it was responsible for his subsequent legal troubles. The complaint, filed Wednesday on Williams' behalf, alleges that his Michigan driver's license photo -- kept in a statewide image repository -- was incorrectly flagged as a likely match to a shoplifting suspect. Investigators had scanned grainy surveillance camera footage of an alleged 2018 theft inside a Shinola watch store in midtown Detroit, police records show. That led to what Williams describes as a humiliating January arrest in front of his wife and young daughters on their front lawn in the Detroit suburb of Farmington Hills.


Congress needs to pass this facial recognition ban now

Mashable

Earlier this year, for the first time (that we know of), a false match by a facial recognition algorithm led to the arrest of an innocent man. Now, members of Congress are finally taking action. Sens. Ed Markey and Jeff Merkley, along with Reps. Pramila Jayapal and Ayanna Pressley, all Democrats, introduced the Facial Recognition and Biometric Technology Moratorium Act of 2020. It's the most aggressive move yet by Congress to limit police use of facial recognition: the bill would ban federal law enforcement from using the technology and cut off federal grants to state and local police that fail to do the same. That it was an innocent Black man who was falsely accused and arrested is not a surprise.


A Flawed Facial Recognition System Sent This Man to Jail

WIRED

In January, Detroit police arrested 42-year-old Robert Williams and charged him with stealing $4,000 worth of watches from a retail store 15 months earlier. Taken away in handcuffs in front of his two children, Williams was sent to an interrogation room where police presented him with their evidence: facial recognition software had matched his driver's license photo with surveillance footage from the night of the crime. Williams had an alibi, The New York Times reports, and immediately denied the charges. Police pointed to the image of the suspect from the night of the theft. "I just see a big black guy," he told NPR.


Boston bans most city use of facial-recognition tech in privacy win

Mashable

Boston on Wednesday joined the still small, but growing, number of U.S. cities that have largely banned city officials' use of facial-recognition technology. The ordinance, sponsored by Councilors Michelle Wu and Ricardo Arroyo, passed by a unanimous vote, according to Councilor Wu. The new measure prohibits both the city of Boston and any city official from using "face surveillance" and "information derived from a face surveillance system." There are, importantly, a few key exceptions. One such exception, allowing city employees to still use technology like Face ID to unlock their personal smartphones, is uncontroversial.


False facial recognition match leads to a wrongful arrest in Detroit

Engadget

Many critics of police facial recognition use warn of the potential for racial bias that leads to false arrests, and unfortunately that appears to have happened. The ACLU has filed a complaint against Detroit police for the wrongful arrest of Robert Williams when a DataWorks Plus facial recognition system incorrectly matched security footage against Williams' driver's license, marking him as a suspect. Officers showed the match to an offsite security consultant who identified Williams as the culprit, but this person never saw the perpetrator first-hand. The ACLU argued that the DataWorks system "can't tell Black people apart" and that the whole system was "tainted" by officers' assumptions that the facial recognition system produced the right suspect. In a Washington Post opinion piece, Williams added that he was concerned about the tech even if it was completely accurate -- he didn't want his daughters' faces to go into a database and prompt future police questioning when they're spotted at a "protest the government didn't like."


'The Computer Got It Wrong': How Facial Recognition Led To A False Arrest In Michigan

NPR Technology

A photo of the alleged suspect in a theft case in Detroit, left, next to the driver's license photo of Robert Williams. An algorithm said Williams was the suspect, but he and his lawyers say the tool produced a false hit. Police in Detroit were trying to figure out who stole five watches from a Shinola retail store.


Black roboticists on racism, bias, and building better AI

#artificialintelligence

Jasmine Lawrence works with the Everyday Robots project from Alphabet's X moonshot factory. She thinks there are a lot of unanswered ethical questions about how to use robots and how to think of them: Are they slaves or tools? Do they replace or complement people? As a product manager, she said, confronting some of those questions can be frightening, and it raises the question of bias and the responsibility of the creator. Lawrence said she wants to be held accountable for the good and bad things she builds.


'How did this happen?': Facial recognition slowly being trialled around the country

#artificialintelligence

When Lauren Dry heard last year that facial recognition cameras were being trialled in the suburb of East Perth, she thought it was a joke. "I just thought to myself: What do you mean facial recognition cameras, that's sci-fi! That doesn't happen in Perth," she told 7.30. "And I looked into it and I was, like, this is real." Ms Dry enjoys a quiet life at home with her young family in Perth's leafy suburbs.