Facial recognition software used by the UK's biggest police force has returned false positives in more than 98 per cent of alerts generated, The Independent can reveal, with the country's biometrics regulator calling it "not yet fit for use". The Metropolitan Police's system has produced 104 alerts, of which only two were later confirmed to be positive matches, a freedom of information request showed. In its response the force said it did not consider the inaccurate matches "false positives" because alerts were checked a second time after they occurred. Facial recognition technology scans people in a video feed and compares their images to pictures stored in a reference library or watch list. It has been used at large events such as the Notting Hill Carnival and a Six Nations rugby match.
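The 98 per cent figure follows directly from the numbers in the FOI response. As a minimal sketch, using only the two figures reported above (104 alerts, two confirmed matches), the rate works out as follows:

```python
# Numbers reported in the Met's freedom of information response.
total_alerts = 104
confirmed_matches = 2

# Every alert not later confirmed is counted here as a false positive.
false_positives = total_alerts - confirmed_matches
rate = false_positives / total_alerts

print(f"{false_positives} of {total_alerts} alerts were false positives: {rate:.1%}")
```

This yields a false-positive rate of roughly 98.1 per cent, consistent with the "more than 98 per cent" reported.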
The last day of January 2019 was sunny, yet bitterly cold in Romford, east London. Shoppers scurrying from retailer to retailer wrapped themselves in winter coats, scarves and hats. The temperature never rose above three degrees Celsius. For police officers positioned next to an inconspicuous blue van, just metres from Romford's Overground station, one man stood out among the thin winter crowds. The man, wearing a beige jacket and blue cap, had pulled his jacket over his face as he moved in the direction of the police officers.
Facial recognition technology used by the UK police is making thousands of mistakes - and now there could be legal repercussions. The civil liberties group Big Brother Watch has teamed up with Baroness Jenny Jones to ask the government and the Met to stop using the technology. They claim the use of facial recognition has proven to be 'dangerously authoritarian', inaccurate and a breach of rights protecting privacy and freedom of expression. If their request is rejected, the group says it will take the case to court in what would be the first legal challenge of its kind. South Wales Police, London's Met and Leicestershire Police are all trialling automated facial recognition systems in public places to identify wanted criminals.
One of the most senior policing figures in Wales has warned that the use of facial recognition technology at the country's biggest football derby this weekend could create miscarriages of justice. Arfon Jones, a veteran Welsh police officer and the North Wales police and crime commissioner, has expressed grave concern about the deployment of the surveillance technology at Sunday's clash between Cardiff City and Swansea City. Civil liberties and fan groups have also criticised South Wales police's decision to train cameras on supporters and employ facial recognition on them at the Cardiff City stadium. Jones, who served as a police officer in North Wales for 30 years, described the plans as "disproportionate". He also accused the South Wales force of being engaged in a "fishing expedition where, once again, football fans are being unfairly targeted in a way that supporters of other sports are not".
South Wales Police are to have a facial recognition app installed on their phones to identify suspects without having to take them to a police station. The force intends to test the app over the next three months, with 50 officers using the technology to confirm the names of people of interest who are stopped on routine patrols. The app will allow officers to run a snapshot of a person through a database of suspects called a watchlist and find potential matches even if the individual gives false or misleading information. The move is the latest sign that police forces in Britain are eager to embrace the controversial technology, which has been criticised for infringing privacy and increasing state powers of surveillance. Liberty, the campaign group, called the announcement "chilling", adding that it was "shameful" that South Wales Police had chosen to press ahead with handheld facial recognition systems even as it faced a court challenge over the technology.