Court finds some fault with UK police force's use of facial recognition tech – TechCrunch

#artificialintelligence

Civil rights campaigners in the UK have won a legal challenge to South Wales Police's (SWP) use of facial recognition technology. The win on appeal is being hailed as a "world-first" victory in the fight against the use of an "oppressive surveillance tool", as human rights group Liberty puts it. However, the police force does not intend to appeal the ruling -- and has said it remains committed to "careful" use of the tech. The back story here is that SWP has been trialing automated facial recognition (AFR) technology since 2017, deploying a system known as AFR Locate on around 50 occasions between May 2017 and April 2019 at a variety of public events in Wales. The force has used the technology in conjunction with watchlists of between 400 and 800 people -- which included persons wanted on warrants; persons who had escaped from custody; persons suspected of having committed crimes; persons who may be in need of protection; vulnerable persons; persons of possible interest to it for intelligence purposes; and persons whose presence at a particular event causes particular concern, per a press summary issued by the appeals court.


Police use of facial recognition gets reined in by UK court - CNET

CNET - News

A close-up of a police facial recognition camera used in Cardiff, Wales. Since 2017, police in the UK have been testing live, or real-time, facial recognition in public places to try to identify criminals. The legality of these trials has been widely questioned by privacy and human rights campaigners, who just won a landmark case that could have a lasting impact on how police use the technology in the future. In a ruling Tuesday, the UK Court of Appeal said South Wales Police had been using the technology unlawfully, which amounted to a violation of human rights. In a case brought by civil liberties campaigner Ed Bridges and supported by human rights group Liberty, three senior judges ruled that the South Wales Police had violated Bridges' right to privacy under the European Convention on Human Rights.


UK court rules police facial recognition trials violate privacy laws

Engadget

Human rights organization Liberty is claiming a win in its native Britain after a court ruled that police trials of facial recognition technology violated privacy laws. The Court of Appeal ruled that the use of automatic facial recognition systems unfairly impacted claimant Ed Bridges' right to a private life. Judges added that there were issues around how people's personal data was being processed, and said that the trials should be halted for now. The court also found that the South Wales Police (SWP) had not done enough to satisfy itself that the facial recognition technology was not biased. A spokesperson for SWP told the BBC that it would not be appealing the judgment, but Chief Constable Matt Jukes said that the force will find a way to "work with" the judgment.


Police Use of Facial Recognition Is Accepted by British Court

#artificialintelligence

In one of the first lawsuits to address the use of live facial recognition technology by governments, a British court ruled on Wednesday that police use of the systems is acceptable and does not violate privacy and human rights. The case has been closely watched by law enforcement agencies, privacy groups and government officials because there is little legal precedent concerning the use of cameras in public spaces that scan people's faces in real time and attempt to identify them from photo databases of criminal suspects. While the technology has advanced quickly, with many companies building systems that can be used by police departments, laws and regulations have been slower to develop. The High Court dismissed the case brought by Ed Bridges, a resident of Cardiff, Wales, who said his rights were violated by the use of facial recognition by the South Wales Police. Mr. Bridges claimed that he had been recorded without permission on at least two occasions -- once while shopping and again while attending a political rally.


UK court backs police use of face recognition, but fight isn't over

New Scientist

Ed Bridges, a man from Cardiff, UK, says the police breached his human rights when they used facial recognition technology, but today a court ruled that the police's actions were lawful. That is, however, hardly the end of the matter. South Wales Police has been trialling automated facial recognition (AFR) technology since April 2017. Other forces around the country are trialling similar systems, including London's Metropolitan Police. Bridges may have been snapped during a pilot called AFR Locate.


Police facial recognition system faces legal challenge

BBC News

A legal challenge against the use of automatic facial recognition technology by police has been launched by a civil liberties group. Automatic facial recognition (AFR) uses CCTV or surveillance cameras to record and compare facial characteristics with images on police databases. Lawyers for Big Brother Watch argue the use of AFR breaches the rights of individuals under the Human Rights Act. The Metropolitan Police says the technology will help keep London safe. The system is being piloted in London, with three other forces - Humberside, South Wales, and Leicestershire - also trialling the technology.
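
The BBC's one-line description of the technology (cameras record faces and compare them against images on police databases) corresponds to a standard embedding-and-threshold matching step. The sketch below shows what that comparison step might look like, assuming face embeddings have already been extracted from camera frames and watchlist photos by a separate face-embedding model; the function names, vector size and threshold value are illustrative assumptions, not details of any deployed police system.

```python
# Minimal sketch of a watchlist-matching step in an AFR-style system.
# Assumes fixed-length face embeddings have already been produced by a
# separate model; names, sizes and the 0.8 threshold are illustrative only.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(probe: np.ndarray,
                            watchlist: dict[str, np.ndarray],
                            threshold: float = 0.8) -> list[tuple[str, float]]:
    """Return watchlist entries whose similarity to the probe exceeds the threshold."""
    hits = []
    for person_id, ref_embedding in watchlist.items():
        score = cosine_similarity(probe, ref_embedding)
        if score >= threshold:
            hits.append((person_id, score))
    return sorted(hits, key=lambda h: h[1], reverse=True)

# Illustrative usage with random vectors standing in for real embeddings.
rng = np.random.default_rng(0)
watchlist = {f"subject_{i}": rng.normal(size=128) for i in range(5)}
probe = watchlist["subject_3"] + rng.normal(scale=0.05, size=128)  # near-duplicate face
print(match_against_watchlist(probe, watchlist))
```

Where the threshold is set drives the trade-off between missed matches and the false alerts reported elsewhere in this digest.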


Police could face legal action over 'authoritarian' facial recognition cameras

Daily Mail - Science & tech

Facial recognition technology used by the UK police is making thousands of mistakes - and now there could be legal repercussions. Civil liberties group Big Brother Watch has teamed up with Baroness Jenny Jones to ask the government and the Met to stop using the technology. They claim the use of facial recognition has proven to be 'dangerously authoritarian', inaccurate and a breach of rights protecting privacy and freedom of expression. If their request is rejected, the group says it will take the case to court in what will be the first legal challenge of its kind. South Wales Police, London's Met and Leicestershire are all trialling automated facial recognition systems in public places to identify wanted criminals.


Police face legal action over use of facial recognition cameras

The Guardian

Two legal challenges have been launched against police forces in south Wales and London over their use of automated facial recognition (AFR) technology on the grounds the surveillance is unregulated and violates privacy. The claims are backed by the human rights organisations Liberty and Big Brother Watch following complaints about biometric checks at the Notting Hill carnival, on Remembrance Sunday, at demonstrations and in high streets. Liberty is supporting Ed Bridges, a Cardiff resident, who has written to the chief constable of South Wales police alleging he was tracked at a peaceful anti-arms protest and while out shopping. Big Brother Watch is working with the Green party peer Jenny Jones who has written to the home secretary, Sajid Javid, and the Metropolitan police commissioner, Cressida Dick, urging them to halt deployment of the "dangerously authoritarian" technology. If the forces do not stop using AFR systems then legal action will follow in the high court, the letters said.


Facial recognition tech used by UK police is making a ton of mistakes

#artificialintelligence

At the end of each summer for the last 14 years, the small Welsh town of Porthcawl has been invaded. Every year its population of 16,000 is swamped by up to 35,000 Elvis fans. Many people attending the yearly festival look the same: they slick back their hair, throw on oversized sunglasses and don white flares. At 2017's Elvis festival, impersonators were faced with something different. Police were trialling automated facial recognition technology to track down criminals.


Facial recognition cameras used by police 'dangerously inaccurate'

Daily Mail - Science & tech

Facial recognition technology used by the UK police is making thousands of mistakes, a new report has found. South Wales Police, London's Met and Leicestershire are all trialling automated facial recognition systems in public places to identify wanted criminals. According to police figures, the system often makes more incorrect matches than correct ones. Experts warned the technology could lead to false arrests and described it as a 'dangerously inaccurate policing tool'. South Wales Police has been testing an automated facial recognition system.
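
The claim that the system "often makes more incorrect matches than correct ones" is a statement about precision: the share of alerts that turn out to be genuine watchlist matches. A minimal worked example with invented counts (not the actual police figures) shows how that share is computed.

```python
# Hypothetical illustration of "more incorrect matches than correct ones".
# The counts below are invented for the example; they are not police figures.
true_alerts = 200     # alerts that really matched a watchlist entry
false_alerts = 2200   # alerts against people not on the watchlist

precision = true_alerts / (true_alerts + false_alerts)
print(f"Precision: {precision:.1%}")  # ~8.3% -- most alerts would be wrong
```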