A legal challenge against the use of automatic facial recognition technology by police has been launched by a civil liberties group. Automatic Facial Recognition (AFR) uses CCTV or surveillance cameras to record and compare facial characteristics with images on police databases. Lawyers for Big Brother Watch argue the use of AFR breaches the rights of individuals under the Human Rights Act. The Metropolitan Police says the technology will help keep London safe. The system is being piloted in London, with three other forces - Humberside, South Wales, and Leicestershire - also trialling the technology.
Facial recognition technology used by the UK police is making thousands of mistakes - and now there could be legal repercussions. Civil liberties group Big Brother Watch has teamed up with Baroness Jenny Jones to ask the government and the Met to stop using the technology. They claim the use of facial recognition has proven to be 'dangerously authoritarian', inaccurate and a breach of rights protecting privacy and freedom of expression. If their request is rejected, the group says it will take the case to court in what would be the first legal challenge of its kind. South Wales Police, London's Met and Leicestershire are all trialling automated facial recognition systems in public places to identify wanted criminals.
Facial recognition technology used by the UK police is making thousands of mistakes, a new report has found. According to police figures, the system often makes more incorrect matches than correct ones. Experts warned the technology could lead to false arrests and described it as a 'dangerously inaccurate policing tool'.