Police could face legal action over 'authoritarian' facial recognition cameras

Daily Mail - Science & tech

Facial recognition technology used by the UK police is making thousands of mistakes - and now there could be legal repercussions. Civil liberties group Big Brother Watch has teamed up with Baroness Jenny Jones to ask the government and the Met to stop using the technology. They claim the use of facial recognition has proven to be 'dangerously authoritarian', inaccurate and a breach of rights protecting privacy and freedom of expression. If their request is rejected, the group says it will take the case to court in what will be the first legal challenge of its kind. South Wales Police, London's Met and Leicestershire are all trialling automated facial recognition systems in public places to identify wanted criminals.


Police facial recognition system faces legal challenge

BBC News

A legal challenge against the use of automatic facial recognition technology by police has been launched by a civil liberties group. Automatic Facial Recognition uses CCTV or surveillance cameras to record and compare facial characteristics with images on police databases. Lawyers for Big Brother Watch argue the use of AFR breaches the rights of individuals under the Human Rights Act. The Metropolitan Police says the technology will help keep London safe. The system is being piloted in London, with three other forces - Humberside, South Wales, and Leicestershire - also trialling the technology.


Facial recognition tech used by UK police is making a ton of mistakes

#artificialintelligence

At the end of each summer for the last 14 years, the small Welsh town of Porthcawl has been invaded. Every year its 16,000 population is swamped by up to 35,000 Elvis fans. Many people attending the yearly festival look the same: they slick back their hair, throw on oversized sunglasses and don white flares. At 2017's Elvis festival, impersonators were faced with something different. Police were trialling automated facial recognition technology to track down criminals.


Facial recognition cameras used by police 'dangerously inaccurate'

Daily Mail - Science & tech

Facial recognition technology used by the UK police is making thousands of mistakes, a new report has found. South Wales Police, London's Met and Leicestershire are all trialling automated facial recognition systems in public places to identify wanted criminals. According to police figures, the system often makes more incorrect matches than correct ones. Experts warned the technology could lead to false arrests and described it as a 'dangerously inaccurate policing tool'. South Wales Police has been testing an automated facial recognition system.


The backlash against face recognition has begun – but who will win?

New Scientist

A growing backlash against face recognition suggests the technology has reached a crucial tipping point, as battles over its use are erupting on numerous fronts. Face-tracking cameras have been trialled in public by at least three UK police forces in the last four years. A court case against one force, South Wales Police, began earlier this week, backed by human rights group Liberty. Ed Bridges, an office worker from Cardiff whose image was captured during a test in 2017, says the technology is an unlawful violation of privacy, an accusation the police force denies. Avoiding the camera's gaze has got others in trouble.