Vision


In China, facial recognition, public shaming and control go hand in hand

#artificialintelligence

A screen shows a demonstration of SenseTime Group's SenseVideo pedestrian and vehicle recognition system at the company's showroom in Beijing. Facial recognition supporters in the US often argue that the surveillance technology is reserved for the greatest risks -- to help deal with violent crimes, terrorist threats and human trafficking. And while it's often used in the US for petty crimes like shoplifting, stealing $12 worth of goods or selling $50 worth of drugs, that use still looks tame compared with how widely facial recognition has been deployed in China. A database leak in 2019 gave a glimpse of how pervasive China's surveillance tools are -- with more than 6.8 million records from a single day, taken from cameras positioned around hotels, parks, tourism spots and mosques, logging details on people as young as 9 days old. The Chinese government is accused of using facial recognition to commit atrocities against Uyghur Muslims, relying on the technology to carry out "the largest mass incarceration of a minority population in the world today."


The newest Ring doorbell comes with better features, and it's on sale

Mashable

TL;DR: Keep up with everything that goes on at your door with the Ring Video Doorbell 3 for $179.99, a $20 savings as of Aug. 11. We've all come to cherish our homes in all-new ways over these past few months, so it's only right that we take care of them to the best of our ability. If you don't already own a home security system, now's the time to get one. If setting up cameras in different spots and at different angles sounds a bit too complicated, you may want to consider the Ring Video Doorbell 3. You'll be able to see, hear, and even speak to anyone who approaches your home without a messy setup process. Thanks to its wireless design, mounting the smart doorbell to your house takes just a few minutes.


To Catch a Poacher: How Our Engineers Brought AI Tech to the Fight Against the Illegal Wildlife Trade

#artificialintelligence

In the wildlife reserves of East Africa, elephants, rhinos, gorillas, and other large mammals are hunted by poachers. All that stands between these animals and harm are small teams of park rangers and conservationists. The danger is very real for these species on the brink: a staggering 35,000 African elephants are killed each year, putting them just a decade away from extinction, according to the non-profit RESOLVE. Technology is an increasingly critical tool for protecting elephants and other large animals, given their necessarily expansive habitats: a group of just 50 rangers in Kenya, for example, covers a reserve of 3,000 square miles. Park rangers and conservationists have used motion-activated camera traps to catch poachers in the act, but by the time rangers can respond, the animals are tragically already lost.
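The excerpt stops short of describing the engineering, but the core idea behind such systems is to run a detector on camera-trap frames at or near the edge and alert rangers the moment a person appears, rather than reviewing footage after the fact. The sketch below is a minimal, hypothetical illustration of that idea only: the off-the-shelf torchvision detector, the 0.7 confidence threshold, the camera_trap_frames directory, and the print-based alert are all assumptions standing in for whatever the real deployment uses.

```python
# Hypothetical sketch: flag camera-trap frames that contain a person so rangers
# can be alerted in near real time. Model choice, threshold, input directory and
# the alert channel are assumptions, not details from the article.
from pathlib import Path

import torch
from PIL import Image
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.transforms.functional import to_tensor

PERSON_CLASS_ID = 1     # "person" in the COCO label map used by torchvision detectors
SCORE_THRESHOLD = 0.7   # assumed confidence cut-off

# Pretrained COCO detector (requires a recent torchvision with the weights API).
model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()


def frame_contains_person(image_path: Path) -> bool:
    """Return True if the detector finds a person above the score threshold."""
    image = to_tensor(Image.open(image_path).convert("RGB"))
    with torch.no_grad():
        detections = model([image])[0]
    for label, score in zip(detections["labels"], detections["scores"]):
        if label.item() == PERSON_CLASS_ID and score.item() >= SCORE_THRESHOLD:
            return True
    return False


if __name__ == "__main__":
    for frame in sorted(Path("camera_trap_frames").glob("*.jpg")):
        if frame_contains_person(frame):
            # Placeholder for the alerting path (SMS, radio, satellite uplink, ...).
            print(f"ALERT: possible poacher in {frame.name}")
```

In a field deployment the same loop would run on an embedded device attached to the camera, with the alert sent over a satellite or cellular link instead of printed to a console.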


Is police use of face recognition now illegal in the UK?

New Scientist

The UK Court of Appeal has unanimously ruled against a face-recognition system used by South Wales Police. The judgment, which called the use of automated face recognition (AFR) "unlawful", could have ramifications for the widespread use of such technology across the UK. But there is disagreement about exactly what the consequences will be. Ed Bridges, who initially launched a case after police cameras digitally analysed his face in the street, had appealed against the force's use of face recognition with the support of the civil liberties campaign group Liberty. The police force argued in court that the technology was similar to the use of closed-circuit television (CCTV) cameras in cities.


South Wales police lose landmark facial recognition case

The Guardian

The use of facial recognition technology by South Wales police broke race and sex equalities law and breached privacy rights because the force did not apply proper safeguards, the court of appeal has ruled. The critical judgment came in a case brought by Ed Bridges, a civil liberties campaigner, who was scanned by the police software in Cardiff in 2017 and 2018. He argued that the capture of thousands of faces was indiscriminate. Bridges' case had previously been rejected by the high court, but the court of appeal ruled in his favour on three counts, in a significant test case for how the controversial technology is applied in practice by police. The appeal court held that Bridges' right to privacy, under article 8 of the European convention on human rights, was breached because there was "too broad a discretion" left to police officers as to who to put on the force's watchlist of suspects.


UK court rules police facial recognition trials violate privacy laws

Engadget

Human rights organization Liberty is claiming a win in its native Britain after a court ruled that police trials of facial recognition technology violated privacy laws. The Court of Appeal ruled that the use of automatic facial recognition systems unfairly impacted claimant Ed Bridges' right to a private life. Judges added that there were issues around how people's personal data was being processed, and said that the trials should be halted for now. The court also found that South Wales Police (SWP) had not done enough to satisfy itself that the facial recognition technology was not biased. A spokesperson for SWP told the BBC that it would not be appealing the judgment, but Chief Constable Matt Jukes said that the force would find a way to "work with" the ruling.


Facial recognition use by South Wales Police ruled unlawful

BBC News

The use of automatic facial recognition (AFR) technology by South Wales Police is unlawful, the Court of Appeal has ruled. It follows a legal challenge brought by civil rights group Liberty and Ed Bridges, 37, from Cardiff. But the court also found its use was a proportionate interference with human rights, as the benefits outweighed the impact on Mr Bridges. South Wales Police said it would not be appealing the findings. Mr Bridges had said being identified by AFR caused him distress.


Models Trained to Keep the Trains Running

#artificialintelligence

Steady advances in machine vision techniques such as convolutional neural networks powered by graphics processors, along with emerging technologies like neuromorphic silicon retina "event cameras", are creating a range of new predictive monitoring and maintenance use cases. We've reported on several, including machine vision systems that help utilities monitor transmission lines and towers linked to wildfires in California. Now, AI software vendor Ignitarium and partner AVerMedia, an image capture and video transmission specialist, have expanded deployment of an aircraft-based platform for detecting railway track obstructions. The AI-based visual "defect detection" platform incorporates Ignitarium's AI software, implemented on Nvidia's edge AI platform, which is used to automatically control the onboard cameras. The system is designed to keep the cameras focused on the track center during airborne inspections.
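As a rough illustration of the convolutional approach the excerpt refers to, the hypothetical sketch below classifies track-image patches as clear or defective. Ignitarium's actual network, input resolution, and defect classes aren't described here, so the layer sizes, the 128x128 patch, and the two-class output are assumptions.

```python
# Minimal, assumed sketch of a CNN "defect detection" classifier for track patches.
# Real systems of this kind would be trained on labeled inspection imagery and run
# on an edge device; this only shows the model structure and an inference call.
import torch
from torch import nn


class TrackDefectNet(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # global pooling keeps the head input-size agnostic
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))


if __name__ == "__main__":
    model = TrackDefectNet().eval()
    # One dummy 128x128 RGB patch, e.g. cropped around the detected track center.
    patch = torch.rand(1, 3, 128, 128)
    with torch.no_grad():
        probs = model(patch).softmax(dim=1)
    print({"clear": probs[0, 0].item(), "defect": probs[0, 1].item()})
```

The global average pooling keeps the classifier head independent of the exact patch size, which is convenient when crops around the detected track center vary from frame to frame.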


University of Michigan study advocates ban of facial recognition in schools

#artificialintelligence

A newly published study by University of Michigan researchers shows that facial recognition technology in schools presents multiple problems and has limited efficacy. Led by Shobita Parthasarathy, director of the university's Science, Technology, and Public Policy (STPP) program, the researchers say the technology isn't suited to security purposes and can actively promote racial discrimination, normalize surveillance, and erode privacy while institutionalizing inaccuracy and marginalizing non-conforming students. The study follows the New York legislature's passage of a moratorium on the use of facial recognition and other forms of biometric identification in schools until 2022. The bill, which came in response to the Lockport City School District's launch of facial recognition, was among the first in the nation to explicitly regulate or ban use of the technology in schools. That development came after companies including Amazon, IBM, and Microsoft halted or ended the sale of facial recognition products in response to the first wave of Black Lives Matter protests in the U.S. The University of Michigan study -- part of STPP's Technology Assessment Project -- employs an analogical case comparison method, examining previous uses of security technology such as CCTV cameras, metal detectors, and biometric systems to anticipate the implications of facial recognition.