King's Cross Central's developers said they wanted facial-recognition software to spot people on the site who had previously committed an offence there. The detail emerged in a letter one of its managers sent to the London mayor on 14 August. Sadiq Khan had sought reassurance that using facial recognition on the site was legal. Two days earlier, Argent had indicated it was using the technology to "ensure public safety". On Monday, it said it had now scrapped work on new uses of the technology.
The mayor of London has written to the owner of the King's Cross development demanding to know whether the company believes its use of facial recognition software in its CCTV systems is legal. Sadiq Khan said he wanted to express his concern a day after the property company behind the 27-hectare (67-acre) central London site admitted it was using the technology "in the interests of public safety". In his letter, shared with the Guardian, the Labour mayor writes to Robert Evans, the chief executive of the King's Cross development, to "request more information about exactly how this technology is being used". Khan also asks for "reassurance that you have been liaising with government ministers and the Information Commissioner's Office to ensure its use is fully compliant with the law as it stands". The owner of King's Cross is one of the first property companies to acknowledge it is deploying facial recognition software, even though it has been criticised by human rights group Liberty as "a disturbing expansion of mass surveillance".
Facial recognition technology will not be deployed at the King's Cross development in the future, following a backlash prompted by the site owner's admission last month that the software had been used in its CCTV systems. The developer behind the prestigious central London site said the surveillance software had been used between May 2016 and March 2018 in two cameras on a busy pedestrian street running through its heart. It said it had abandoned plans for a wider deployment across the 67-acre, 50-building site and had "no plans to reintroduce any form of facial recognition technology at the King's Cross Estate". The site became embroiled in the debate about the ethics of facial recognition three weeks ago, after the developer released a short statement saying its cameras "use a number of detection and tracking methods, including facial recognition". That made it one of the first landowners to acknowledge it was deploying the software, described by human rights groups as authoritarian, partly because it captures and analyses images of people without their consent.
Facial recognition software has become increasingly common in recent years. Facebook uses it to tag your photos; the FBI has a massive facial recognition database spanning hundreds of millions of images; and in New York, there are even plans to add smart, facial recognition surveillance cameras to every bridge and tunnel. But while these systems seem inescapable, the technology that underpins them is far from infallible. In fact, it can be beaten with a pair of psychedelic-looking glasses that cost just $0.22. Researchers from Carnegie Mellon University have shown that specially designed spectacle frames can fool even state-of-the-art facial recognition software.
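The Carnegie Mellon attack works by optimising the pixels of the glasses frame so that the classifier's output is pushed toward a chosen identity, while the rest of the face is left untouched. The sketch below illustrates that idea in miniature: it is not the researchers' method (which targeted a deep face-recognition network), but a toy gradient-ascent attack on a hypothetical logistic-regression "classifier", with the perturbation confined to a binary mask standing in for the glasses region. All weights, mask shapes, and step sizes here are illustrative assumptions.

```python
import numpy as np

# Toy stand-in for a face classifier: logistic regression over raw pixels.
# The real CMU attack targeted a deep network; this model, its weights,
# and the mask below are hypothetical, chosen only to show the mechanism.
rng = np.random.default_rng(0)
n_pixels = 64 * 64
w = rng.normal(size=n_pixels)  # hypothetical classifier weights

def predict(x):
    """Probability the classifier assigns to the 'target person' class."""
    return 1.0 / (1.0 + np.exp(-(x @ w)))

# Binary mask: 1 where the glasses frame sits, 0 everywhere else.
# Only masked pixels may be modified -- the attacker's real face is fixed.
mask = np.zeros(n_pixels)
mask[: n_pixels // 10] = 1.0  # stand-in region covering ~10% of pixels

x0 = rng.uniform(size=n_pixels)  # the attacker's original face image
x = x0.copy()
target = 1.0  # impersonation: drive the target-class probability toward 1

# Gradient ascent on the target-class log-likelihood, restricted to the
# mask, so only the printable glasses region is ever changed.
for _ in range(100):
    p = predict(x)
    grad = (target - p) * w  # analytic gradient for the logistic model
    x = np.clip(x + 0.05 * np.sign(grad) * mask, 0.0, 1.0)

print("before:", predict(x0), "after:", predict(x))
```

After the loop, the classifier's confidence in the target identity has risen sharply even though every pixel outside the mask is identical to the original image; in the physical attack, the optimised mask region is printed onto wearable frames.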