A fight over facial recognition technology gets fiercer during the pandemic

#artificialintelligence

The long-simmering debate over facial recognition technology is taking on new urgency during the pandemic, as companies rush to pitch face-scanning systems to track the movements of Covid-19 patients. That's playing out in California, where state legislators on Tuesday will debate legislation that would regulate the use of the technology. Its most controversial element: It would permit companies and public agencies to feed people's facial data into a recognition system without their consent if there is probable cause to believe they've engaged in criminal activity. The bill isn't specifically meant for the coronavirus response, but, if enacted, it could shape the way that people with Covid-19 and their contacts are tracked and traced in the coming months. The legislation has won the support of Microsoft, but it has garnered opposition from more than 40 civil rights and privacy groups and from 18 public health scholars.


Controversial facial-recognition software used 30,000 times by LAPD in last decade, records show

Los Angeles Times

The Los Angeles Police Department has used facial-recognition software nearly 30,000 times since 2009, with hundreds of officers running images of suspects from surveillance cameras and other sources against a massive database of mugshots taken by law enforcement. The new figures, released to The Times, reveal for the first time how commonly facial recognition is used in the department, which for years has provided vague and contradictory information about how and whether it uses the technology. The LAPD has consistently denied having records related to facial recognition, and at times denied using the technology at all. The truth is that, while it does not have its own facial-recognition platform, LAPD personnel have access to facial-recognition software through a regional database maintained by the Los Angeles County Sheriff's Department. And between Nov. 6, 2009, and Sept. 11 of this year, LAPD officers used the system's software 29,817 times.


The Quiet Growth of Race-Detection Software Sparks Concerns Over Bias

WSJ.com: WSJD - Technology

In the last few years, companies have started using race-detection software to understand how certain customers use their products, who looks at their ads, or what people of different racial groups like. Others use the tool to search stock-photography collections for faces of particular races, typically for ads, or in security applications, to help narrow the search for someone in a database. In China, where face tracking is widespread, surveillance cameras have been equipped with race-scanning software to track ethnic minorities. The field is still developing, and it is an open question how companies, governments and individuals will take advantage of such technology in the future. Use of the software is fraught, as researchers and companies have begun to recognize its potential to drive discrimination, posing challenges to widespread adoption.


Facial Recognition: Should We Fear It or Embrace It?

#artificialintelligence

Facial-recognition technology is not new, but it has progressed immensely in the past few years, mainly because of advances in artificial intelligence. Naturally, this has drawn the interest of Silicon Valley, advertising agencies, hardware manufacturers, and the government. But not everyone is thrilled. The American Civil Liberties Union (ACLU) and 35 other advocacy groups, for example, sent a letter to Amazon CEO Jeff Bezos demanding that his company stop providing advanced facial-recognition technology to law enforcement, warning that it could be misused against immigrants and protesters. Early iterations of the technology, which dates back to the 1960s, were clunky.


Examining The San Francisco Facial-Recognition Ban

#artificialintelligence

On May 14, 2019, San Francisco became the first major city in the United States to ban the use of facial-recognition technology by its government and law enforcement agencies. The ban comes as part of a broader anti-surveillance ordinance, which, as of May 14, was set to go into effect in about a month. Local officials and civil advocates seem to fear the repercussions of allowing facial-recognition technology to proliferate throughout San Francisco, while supporters of the software claim that the ban could limit technological progress. In this article, I'll examine the ban that just took place in San Francisco, explore the concerns surrounding facial-recognition technology, and explain why an outright ban may not be the best course of action.