Police use of facial recognition is legal, Cardiff high court rules

The Guardian

Police use of automatic facial recognition technology to search for people in crowds is lawful, the high court in Cardiff has ruled. Although the mass surveillance system interferes with the privacy rights of those scanned by security cameras, the judges concluded, it is not illegal. The legal challenge was brought by Ed Bridges, a former Liberal Democrat councillor from Cardiff, who noticed the cameras when he went out to buy a lunchtime sandwich. He was supported by the human rights organisation Liberty. Bridges said he was distressed by police use of the technology, which he believes captured his image while he was out shopping and later when he attended a peaceful protest against the arms trade.


UK privacy activist to appeal after facial recognition case fails

#artificialintelligence

British privacy activist Ed Bridges is set to appeal against a landmark ruling endorsing police use of facial recognition technology to hunt for suspects, a practice he calls "sinister". In what is believed to be the world's first case of its kind, Bridges told the High Court in Wales that the local police breached his rights by scanning his face without consent. "This sinister technology undermines our privacy and I will continue to fight against its unlawful use to ensure our rights are protected and we are free from disproportionate government surveillance," Bridges said in a statement. But the judges said the police's use of facial recognition technology was lawful and legally justified. Civil rights group Liberty, which represented the 36-year-old Bridges, said it would appeal against the "disappointing" decision, while police chiefs said they understood the public's fears.


Metropolitan Police's facial recognition technology 98% inaccurate, figures show

The Independent - Tech

Facial recognition software used by the UK's biggest police force has returned false positives in more than 98 per cent of alerts generated, The Independent can reveal, with the country's biometrics regulator calling it "not yet fit for use". The Metropolitan Police's system has produced 104 alerts, of which only two were later confirmed to be positive matches; the other 102, roughly 98.1 per cent, were false, a freedom of information request showed. In its response, the force said it did not consider the inaccurate matches "false positives" because alerts were checked a second time after they occurred. Facial recognition technology scans people in a video feed and compares their images to pictures stored in a reference library or watch list. It has been used at large events such as the Notting Hill Carnival and a Six Nations rugby match.
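The headline figure follows directly from those two counts. A minimal sketch of the arithmetic in Python (the function and variable names are ours, for illustration only, not from the Met's system):

```python
# Reproduces the reported rate from the counts disclosed under freedom
# of information; the names here are illustrative, not the Met's.

def false_positive_rate(total_alerts: int, confirmed_matches: int) -> float:
    """Share of alerts that were never confirmed as genuine matches."""
    false_alerts = total_alerts - confirmed_matches
    return false_alerts / total_alerts

rate = false_positive_rate(total_alerts=104, confirmed_matches=2)
print(f"{rate:.1%}")  # 98.1%, i.e. "more than 98 per cent"
```

Note that the force disputes the "false positive" label because alerts are double-checked by an officer; the sketch simply computes the share of alerts that were never confirmed.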


Facial recognition tech: watchdog calls for code to regulate police use

The Guardian

The information commissioner has expressed concern over the lack of a formal legal framework for the use of facial recognition cameras by the police. A barrister for the commissioner, Elizabeth Denham, told a court that the current guidelines around automated facial recognition (AFR) technology were "ad hoc" and that a clear code was needed. In a landmark case, Ed Bridges, an office worker from Cardiff, claims South Wales police violated his privacy and data protection rights by using AFR on him when he went to buy a sandwich during his lunch break and when he attended a peaceful anti-arms demonstration. The technology maps faces in a crowd and then compares them with a watchlist of images, which can include suspects, missing people or persons of interest to the police. The cameras have been used to scan faces in large crowds in public places such as streets and shopping centres, and at events such as football matches and the Notting Hill Carnival.
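As a rough illustration of the matching step described above, and not a description of the actual AFR system South Wales police deployed, a typical pipeline converts each detected face into a numerical embedding and raises an alert when its similarity to any watchlist entry crosses a threshold. A minimal sketch, with random stand-in embeddings and an assumed threshold:

```python
import numpy as np

# Illustrative sketch of watchlist matching: real systems use a face
# detector plus a learned embedding model; here the vectors are random
# stand-ins and the threshold is an arbitrary assumption.

rng = np.random.default_rng(0)
EMBED_DIM = 128
THRESHOLD = 0.6  # assumed operating point, not a published figure

watchlist = rng.normal(size=(50, EMBED_DIM))   # stored reference images
watchlist /= np.linalg.norm(watchlist, axis=1, keepdims=True)

def match(face_embedding: np.ndarray) -> int | None:
    """Return the watchlist index of the best match above threshold, else None."""
    face = face_embedding / np.linalg.norm(face_embedding)
    similarities = watchlist @ face          # cosine similarity per entry
    best = int(np.argmax(similarities))
    return best if similarities[best] >= THRESHOLD else None

# One face cropped from the video feed (a random stand-in here).
detected = rng.normal(size=EMBED_DIM)
print(match(detected))  # index into the watchlist, or None for no alert
```

Where the threshold is set determines the trade-off at issue in the Metropolitan Police figures above: a lower threshold catches more genuine matches but generates far more false alerts.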


Police fear bias in use of artificial intelligence to fight crime

#artificialintelligence

British police officers are among those concerned that the use of artificial intelligence in fighting crime is raising the risk of profiling bias, according to a report commissioned by government officials. The paper warned that algorithms might judge people from disadvantaged backgrounds as "a greater risk" since they were more likely to have contact with public services, thus generating more data that in turn could be used to train the AI. "Police officers themselves are concerned about the lack of safeguards and oversight regarding the use of algorithms in fighting crime," said researchers from the Royal United Services Institute, a defence think-tank. The report acknowledged that emerging technology, including facial recognition, had "many potential benefits". But it warned that assessment of long-term risks was "often missing".
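The mechanism the report describes can be made concrete with a toy simulation (the numbers and names below are entirely hypothetical and ours, not RUSI's): two groups offend at the same underlying rate, but one group's members are observed by public services three times as often, so a score trained on recorded incidents rates that group as higher risk.

```python
import random

random.seed(0)

# Toy illustration of the feedback loop the report warns about: both
# groups have the same true offence rate, but group B is observed three
# times as often, so the *recorded* rates diverge.

TRUE_RATE = 0.05                      # same underlying rate for both groups
CONTACT_PROB = {"A": 0.2, "B": 0.6}   # B has 3x the contact with services

def recorded_incidents(group: str, people: int = 10_000) -> int:
    count = 0
    for _ in range(people):
        offended = random.random() < TRUE_RATE
        observed = random.random() < CONTACT_PROB[group]
        count += offended and observed  # only observed incidents enter the data
    return count

for group in ("A", "B"):
    per_capita = recorded_incidents(group) / 10_000
    print(group, f"recorded rate: {per_capita:.3f}")
# Group B's recorded rate comes out roughly three times group A's despite
# identical behaviour, so a model trained on these records, and a police
# force acting on it, would label group B "a greater risk".
```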