When Wales takes on Ireland in the Six Nations rugby championship on Saturday, Big Brother will be watching. Fans filing into the stadium in Cardiff will be scanned with facial recognition software as part of a police trial of the technology. Should any of their faces match a database of potential suspects, officers will be standing by, ready to swoop. It's the kind of indiscriminate mass surveillance that would, in ordinary times, be expected to prompt fierce debate in the U.K., with journalists and politicians fighting over the proper balance between privacy and security. Instead, trial runs like the one in South Wales are taking place largely unchallenged by Parliament.
As the first step on the road to a powerful, high-tech surveillance apparatus, it was a little underwhelming: a blue van topped by almost comically intrusive cameras, a few police officers staring intently but ineffectually at their smartphones, and a lot of bemused shoppers. As unimpressive as the moment may have been, however, the decision by London's Metropolitan Police to expand its use of live facial recognition (LFR) marks a significant shift in the debate over privacy, security and surveillance in public spaces. Despite dismal accuracy results in earlier trials, the Metropolitan Police Service (MPS) has announced it is pushing ahead with the roll-out of LFR at locations across London. The MPS says the cameras will be focused on a small, targeted area "where intelligence suggests [they] are most likely to locate serious offenders," and will match faces against a database of individuals wanted by the police. The cameras will be accompanied by clear signposting and by officers handing out leaflets (it is unclear why the MPS thinks serious offenders would choose to walk through an area full of police officers handing out leaflets to passersby).
Bipartisanship in modern politics can seem like a mythical creature. But in recent months, as Congress considered regulation of one of the most controversial topics it faces -- how, when, or whether to use facial recognition -- we've gotten glimpses of a political unicorn. In two House Oversight and Reform Committee hearings last summer, some of the most prominent Republicans and Democrats in the United States Congress joined together in calls for legislative reform. Proponents of regulation ranged from Rep. Alexandria Ocasio-Cortez (D-NY) to Rep. Jim Jordan (R-OH), a frequent defender of President Trump on cable news. On Friday, Jordan was also appointed to the House Intelligence Committee to confront witnesses in the public presidential impeachment hearings that begin this week.
The UK Court of Appeal has unanimously ruled against a face recognition system used by South Wales Police. The judgment, which called the use of automated face recognition (AFR) "unlawful", could have ramifications for the widespread use of such technology across the UK, though there is disagreement about exactly what the consequences will be. Ed Bridges, who launched the original case after police cameras digitally analysed his face in the street, had appealed against the police use of face recognition with the support of the civil liberties campaign group Liberty. The police force had argued in court that the technology was similar to the use of closed-circuit television (CCTV) cameras in cities.
The UK's privacy regulator said it was studying the use of controversial facial recognition technology by property companies, amid concerns that its use in CCTV systems at the King's Cross development in central London may not be legal. The Information Commissioner's Office warned businesses using the surveillance technology that they needed to demonstrate its use was "strictly necessary and proportionate" and had a clear basis in law. The data protection regulator added that it was "currently looking at the use of facial recognition technology" by the private sector and warned it would "consider taking action where we find non-compliance with the law". On Monday, the owners of the King's Cross site confirmed that facial recognition software was used around the 67-acre, 50-building site "in the interest of public safety and to ensure that everyone who visits has the best possible experience". It is one of the first landowners or property companies in Britain to acknowledge deploying the software, which a human rights pressure group has described as "authoritarian", partly because it captures images of people without their consent.