Facial recognition tech: watchdog calls for code to regulate police use

The Guardian

The information commissioner has expressed concern over the lack of a formal legal framework for the use of facial recognition cameras by the police. A barrister for the commissioner, Elizabeth Denham, told a court the current guidelines around automated facial recognition (AFR) technology were "ad hoc" and a clear code was needed. In a landmark case, Ed Bridges, an office worker from Cardiff, claims South Wales police violated his privacy and data protection rights by using AFR on him when he went to buy a sandwich during his lunch break and when he attended a peaceful anti-arms demonstration. The technology maps faces in a crowd and then compares them with a watchlist of images, which can include suspects, missing people or persons of interest to the police. The cameras have been used to scan faces in large crowds in public places such as streets and shopping centres, at football matches, and at music events such as the Notting Hill carnival.
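The matching step these reports describe, encoding faces detected in a crowd and comparing them against a watchlist, can be sketched in a few lines. The following is a minimal illustration, assuming the open-source Python face_recognition library; the file names, watchlist contents and 0.6 distance threshold are illustrative assumptions, not details of any police system.

```python
# Sketch of watchlist-style face matching using the open-source
# face_recognition library. File names and the threshold are
# illustrative assumptions, not details of any police deployment.
import face_recognition

# Build the "watchlist": one encoding per known image
# (assumes each watchlist image contains exactly one face).
watchlist_files = ["suspect.jpg", "missing_person.jpg"]  # hypothetical files
watchlist_encodings = [
    face_recognition.face_encodings(face_recognition.load_image_file(f))[0]
    for f in watchlist_files
]

# Detect and encode every face visible in a crowd photo.
crowd = face_recognition.load_image_file("crowd.jpg")  # hypothetical file
crowd_encodings = face_recognition.face_encodings(crowd)

# Compare each detected face against the watchlist; a smaller distance
# means a closer match. 0.6 is the library's conventional default cutoff.
for i, face in enumerate(crowd_encodings):
    distances = face_recognition.face_distance(watchlist_encodings, face)
    for name, dist in zip(watchlist_files, distances):
        if dist < 0.6:
            print(f"Face {i} resembles {name} (distance {dist:.2f})")
```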


Office worker launches UK's first police facial recognition legal action

The Guardian

An office worker who believes his image was captured by facial recognition cameras when he popped out for a sandwich in his lunch break has launched a groundbreaking legal battle against the use of the technology. Supported by the campaign group Liberty, Ed Bridges, from Cardiff, raised money through crowdfunding to pursue the action, claiming the suspected use of the technology on him by South Wales police was an unlawful violation of privacy. Bridges, 36, claims he was distressed by the apparent use of the technology and, during a three-day hearing at Cardiff civil justice and family centre, is also arguing that it breaches data protection and equality laws. Facial recognition technology maps faces in a crowd and then compares them to a watchlist of images, which can include suspects, missing people and persons of interest to the police. The cameras scan faces in large crowds in public places such as streets and shopping centres, at football matches, and at music events such as the Notting Hill carnival.


UK's controversial use of face recognition to be challenged in court

New Scientist

The first legal battle in the UK over police use of face recognition technology will begin today. Ed Bridges has crowdfunded action against South Wales Police over claims that the use of the technology on him was an unlawful violation of privacy. During a three-day hearing at Cardiff Civil Justice and Family Centre, he will also argue that it breaches data protection and equality laws. Face recognition technology maps faces in a crowd, then compares the results with a "watch list" of images, which can include suspects, missing people and persons of interest. Police forces that have trialled the technology hope it can help tackle crime, but campaigners argue it breaches privacy and civil liberties.


Police face legal action over use of facial recognition cameras

The Guardian

Two legal challenges have been launched against police forces in south Wales and London over their use of automated facial recognition (AFR) technology, on the grounds that the surveillance is unregulated and violates privacy. The claims are backed by the human rights organisations Liberty and Big Brother Watch, following complaints about biometric checks at the Notting Hill carnival, on Remembrance Sunday, at demonstrations and in high streets. Liberty is supporting Ed Bridges, a Cardiff resident, who has written to the chief constable of South Wales police alleging he was tracked at a peaceful anti-arms protest and while out shopping. Big Brother Watch is working with the Green party peer Jenny Jones, who has written to the home secretary, Sajid Javid, and the Metropolitan police commissioner, Cressida Dick, urging them to halt deployment of the "dangerously authoritarian" technology. If the forces do not stop using AFR systems, legal action will follow in the high court, the letters said.


Facial recognition must not introduce gender or racial bias, police told

The Guardian

Facial recognition software should only be used by police if they can prove it will not introduce gender or racial bias to operations, an ethics panel has said. A report by the London policing ethics panel, which was set up to advise City Hall, concluded that while there were "important ethical issues to be addressed" in the use of the controversial technology, these did not justify abandoning its use altogether. Live facial recognition (LFR) technology is designed to check people passing a camera in a public place against images on police databases, which can include suspects, missing people or persons of interest to the police. The technology has been used to scan faces in large crowds in public places such as streets and shopping centres, at football matches, and at events such as the Notting Hill carnival. The Metropolitan police have carried out 10 trials using the technology across London, the most recent in Romford town centre in mid-February.