A legal challenge against the use of automatic facial recognition technology by police has been launched by a civil liberties group. Automatic Facial Recognition uses CCTV or surveillance cameras to record and compare facial characteristics with images on police databases. Lawyers for Big Brother Watch argue the use of AFR breaches the rights of individuals under the Human Rights Act. The Metropolitan Police says the technology will help keep London safe. The system is being piloted in London, with three other forces - Humberside, South Wales, and Leicestershire - also trialling the technology.
The accuracy of police facial recognition systems has been criticised by a UK privacy group. Two forces have been testing facial recognition cameras at public events in an effort to catch wanted criminals. Big Brother Watch said its investigation showed the technology was "dangerous and inaccurate" as it had wrongly flagged up a "staggering" number of innocent people as suspects. But police have defended its use and say additional safeguards are in place. Police facial recognition cameras have been trialled at events such as football matches, festivals and parades.
Facial recognition technology used by the UK police is making thousands of mistakes - and now there could be legal repercussions. Civil liberties group Big Brother Watch has teamed up with Baroness Jenny Jones to ask the government and the Met to stop using the technology. They claim the use of facial recognition has proven to be 'dangerously authoritarian', inaccurate and a breach of rights protecting privacy and freedom of expression. If their request is rejected, the group says it will take the case to court in what will be the first legal challenge of its kind. South Wales Police, London's Met and Leicestershire are all trialling automated facial recognition systems in public places to identify wanted criminals.
The Metropolitan police will start using live facial recognition, Britain's biggest force has announced. The decision to deploy the controversial technology, which has been dogged by privacy concerns and questions over its lawfulness, was immediately condemned by civil liberties groups, who described the move as "a breathtaking assault on our rights". But the Met said that after two years of trials, it was ready to use the cameras within a month. The force said it would deploy the technology overtly and only after consulting communities in which it is to be used. Nick Ephgrave, an assistant commissioner, said: "As a modern police force, I believe that we have a duty to use new technologies to keep people safe in London. Independent research has shown that the public support us in this regard."
Live facial recognition cameras will be deployed across London, with the city's Metropolitan Police announcing today that the technology has moved past the trial stage and is ready to be permanently integrated into everyday policing. The cameras will be placed in locations popular with shoppers and tourists, like Stratford's Westfield shopping center and the West End, reports BBC News. Each camera will scan for faces contained in "bespoke" watch lists, which the Met says will predominantly contain individuals "wanted for serious and violent offences." When the camera flags an individual, police officers will approach and ask them to verify their identity. If they're on the watch list, they'll be arrested.