Face Recognition


In China, facial recognition, public shaming and control go hand in hand

#artificialintelligence

A screen shows a demonstration of SenseTime Group's SenseVideo pedestrian and vehicle recognition system at the company's showroom in Beijing. Facial recognition supporters in the US often argue that the surveillance technology is reserved for the greatest risks -- to help deal with violent crimes, terrorist threats and human trafficking. And while it's still often used for petty crimes like shoplifting, stealing $12 worth of goods or selling $50 worth of drugs, its use in the US still looks tame compared with how widely deployed facial recognition has been in China. A database leak in 2019 gave a glimpse of how pervasive China's surveillance tools are -- with more than 6.8 million records from a single day, taken from cameras positioned around hotels, parks, tourism spots and mosques, logging details on people as young as 9 days old. The Chinese government is accused of using facial recognition to commit atrocities against Uyghur Muslims, relying on the technology to carry out "the largest mass incarceration of a minority population in the world today."


Is police use of face recognition now illegal in the UK?

New Scientist

The UK Court of Appeal has unanimously reached a decision against a face-recognition system used by South Wales Police. The judgment, which called the use of automated face recognition (AFR) "unlawful", could have ramifications for the widespread use of such technology across the UK. But there is disagreement about exactly what the consequences will be. Ed Bridges, who initially launched a case after police cameras digitally analysed his face in the street, had appealed, with the support of personal rights campaign group Liberty, against the use of face recognition by police. The police force claimed in court that the technology was similar to the use of closed-circuit television (CCTV) cameras in cities.



South Wales police lose landmark facial recognition case

The Guardian

The use of facial recognition technology by South Wales police broke race and sex equalities law and breached privacy rights because the force did not apply proper safeguards, the court of appeal has ruled. The critical judgment came in a case brought by Ed Bridges, a civil liberties campaigner, who was scanned by the police software in Cardiff in 2017 and 2018. He argued that capturing of thousands of faces was indiscriminate. Bridges' case had previously been rejected by the high court, but the court of appeal ruled in his favour on three counts, in a significant test case for how the controversial technology is applied in practice by police. But the appeal court held that Bridges' right to privacy, under article 8 of the European convention on human rights, was breached because there was "too broad a discretion" left to police officers as to who to put on its watchlist of suspects.


UK court rules police facial recognition trials violate privacy laws

Engadget

Human rights organization Liberty is claiming a win in its native Britain after a court ruled that police trials of facial recognition technology violated privacy laws. The Court of Appeal ruled that the use of automatic facial recognition systems unfairly impacted claimant Ed Bridges' right to a private life. Judges added that there were issues around how people's personal data was being processed, and said that the trials should be halted for now. The court also found that the South Wales Police (SWP) had not done enough to satisfy itself that the facial recognition technology was not biased. A spokesperson for SWP told the BBC that it would not be appealing the judgment, but Chief Constable Matt Jukes said that the force will find a way to "work with" the judgment.


Facial recognition use by South Wales Police ruled unlawful

BBC News

The use of automatic facial recognition (AFR) technology by South Wales Police is unlawful, the Court of Appeal has ruled. It follows a legal challenge brought by civil rights group Liberty and Ed Bridges, 37, from Cardiff. But the court also found its use was a proportionate interference with human rights, as the benefits outweighed the impact on Mr Bridges. South Wales Police said it would not be appealing the findings. Mr Bridges had said being identified by AFR caused him distress.


Michigan University study advocates ban of facial recognition in schools

#artificialintelligence

A newly published study by University of Michigan researchers shows facial recognition technology in schools presents multiple problems and has limited efficacy. Led by Shobita Parthasarathy, director of the university's Science, Technology, and Public Policy (STPP) program, the researchers say the technology isn't suited to security purposes and can actively promote racial discrimination, normalize surveillance, and erode privacy while institutionalizing inaccuracy and marginalizing non-conforming students. The study follows the New York legislature's passage of a moratorium on the use of facial recognition and other forms of biometric identification in schools until 2022. The bill, which came in response to the launch of facial recognition by the Lockport City School District, was among the first in the nation to explicitly regulate or ban use of the technology in schools. That development came after companies including Amazon, IBM, and Microsoft halted or ended the sale of facial recognition products in response to the first wave of Black Lives Matter protests in the U.S. The University of Michigan study -- a part of STPP's Technology Assessment Project -- employs an analogical case comparison method to look at previous uses of security technology like CCTV cameras and metal detectors as well as biometric technologies and anticipate the implications of facial recognition.


US police's facial recognition systems misidentify Black people

Al Jazeera

It has been more than two months since the killing of George Floyd at the hands of police in the United States. And as protests continue, the message is no longer just about specific incidents of violence, but about what demonstrators say is systemic racism in policing. One of the most obvious examples is the widespread use of facial recognition systems that have been proven to misidentify people of colour.


Step By Step Guide To Stabilize Facial Landmarks In A Video Using Dlib

#artificialintelligence

The human face has been a topic of interest for deep learning engineers for quite some time now. Understanding the human face not only helps in facial recognition but finds applications in facial morphing, head pose detection and virtual makeovers. If you are a regular user of social media apps like Instagram or Snapchat, have you wondered how the filters fit perfectly for each face? Though every face on the planet is unique, these filters seem to magically align on your nose, lips and eyes. These filters or face-swapping applications make use of facial landmarks.
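The guide itself covers detecting landmarks per frame with dlib and then stabilizing them so filters don't jitter between frames. As a rough sketch of the stabilization idea only (an exponential moving average over landmark coordinates is one common approach, not necessarily the guide's exact method; `smooth_landmarks` is a hypothetical helper name):

```python
import numpy as np

def smooth_landmarks(frames, alpha=0.4):
    """Exponentially smooth 68-point landmark tracks across video frames.

    frames: list of (68, 2) arrays of (x, y) landmark coordinates, e.g. as
            produced per-frame by a dlib shape predictor.
    alpha:  blend weight for the newest detection; lower values damp
            frame-to-frame jitter more strongly but lag fast motion.
    """
    smoothed = [np.asarray(frames[0], dtype=float)]
    for pts in frames[1:]:
        prev = smoothed[-1]
        # Blend the raw detection with the previous smoothed position.
        smoothed.append(alpha * np.asarray(pts, dtype=float) + (1 - alpha) * prev)
    return smoothed
```

In a real pipeline, each frame's points would come from dlib's face detector plus a trained `shape_predictor` model before being passed through a filter like this.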


AI named after the 'V for Vendetta' mask protects photos from being gathered by facial recognition apps

Daily Mail - Science & tech

Clearview AI is just one of many facial recognition firms scraping billions of online images to create a massive database for purchase -- but a new program could block their efforts. Researchers designed an image cloaking tool that makes subtle pixel-level changes that distort pictures enough so they cannot be used by online scrapers -- and they claim it is 100 percent effective. Named in honor of the 'V for Vendetta' mask, Fawkes is an algorithm and software combination that 'cloaks' an image to trick systems, which is like adding an invisible mask to your face. These altered pictures teach technologies a distorted version of the subject, and when presented with an 'uncloaked' form, the scraping app fails to recognize the individual. 'It might surprise some to learn that we started the Fawkes project a while before the New York Times article that profiled Clearview.ai in February 2020,' researchers from the SAND Lab at the University of Chicago shared in a statement.
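Fawkes works by computing targeted, feature-space perturbations against face-recognition models; the toy sketch below is NOT that algorithm, only an illustration of the underlying premise: changes bounded to a few intensity levels per 8-bit channel are visually negligible yet still alter every pixel a scraper ingests (`cloak_image` and its `budget` parameter are invented for illustration):

```python
import numpy as np

def cloak_image(img, budget=3, seed=0):
    """Toy bounded pixel perturbation (illustration only, not Fawkes).

    img:    uint8 image array, e.g. shape (H, W, 3).
    budget: maximum absolute change per channel; small values keep the
            edit imperceptible to a human viewer.
    """
    rng = np.random.default_rng(seed)
    # Random perturbation in [-budget, +budget] for every pixel/channel.
    delta = rng.integers(-budget, budget + 1, size=img.shape)
    # Apply and clip back into the valid 8-bit range.
    return np.clip(img.astype(int) + delta, 0, 255).astype(np.uint8)
```

The real system chooses the perturbation adversarially, so the cloaked face lands far from the true face in a recognition model's feature space rather than being random noise.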