Images of seven people were passed on by local police for use in a facial recognition system at King's Cross in London in an agreement that was struck in secret, the details of which were made public for the first time today. A police report, published by the deputy London mayor Sophie Linden on Friday, showed that the scheme ran for two years from 2016 without any apparent central oversight from either the Metropolitan police or the office of the mayor, Sadiq Khan. Writing to London assembly members, Linden said she "wanted to pass on the [Metropolitan police service's] apology" for failing to previously disclose that the scheme existed and announced that similar local image sharing agreements were now banned. There had been "no other examples of images having been shared with private companies for facial recognition purposes" by the Met, Linden said, according to "the best of its knowledge and record-keeping". The surveillance scheme – controversial because it involved tracking individuals without their consent – was originally agreed between borough police in Camden and the owner of the 27-hectare King's Cross site in 2016.
The UK's privacy regulator said it was studying the use of controversial facial recognition technology by property companies amid concerns that its use in CCTV systems at the King's Cross development in central London may not be legal. The Information Commissioner's Office warned businesses using the surveillance technology that they needed to demonstrate its use was "strictly necessary and proportionate" and had a clear basis in law. The data protection regulator added that it was "currently looking at the use of facial recognition technology" by the private sector and warned it would "consider taking action where we find non-compliance with the law". On Monday, the owners of the King's Cross site confirmed that facial recognition software was used around the 67-acre, 50-building site "in the interest of public safety and to ensure that everyone who visits has the best possible experience". It is one of the first landowners or property companies in Britain to acknowledge deploying the software, described by a human rights pressure group as "authoritarian", partly because it captures images of people without their consent.
The UK's privacy watchdog has opened an investigation into the use of facial recognition cameras in a busy part of central London. The information commissioner, Elizabeth Denham, announced she would look into the technology being used in Granary Square, close to King's Cross station. Two days ago the mayor of London, Sadiq Khan, wrote to the development's owner demanding to know whether the company believed its use of facial recognition software in its CCTV systems was legal. The Information Commissioner's Office (ICO) said it was "deeply concerned about the growing use of facial recognition technology in public spaces" and was seeking detailed information about how the technology was being used. "Scanning people's faces as they lawfully go about their daily lives in order to identify them is a potential threat to privacy that should concern us all," Denham said.
Facial recognition technology will not be deployed at the King's Cross development in the future, following a backlash prompted by the site owner's admission last month that the software had been used in its CCTV systems. The developer behind the prestigious central London site said the surveillance software had been used between May 2016 and March 2018 in two cameras on a busy pedestrian street running through its heart. It said it had abandoned plans for a wider deployment across the 67-acre, 50-building site and had "no plans to reintroduce any form of facial recognition technology at the King's Cross Estate". The site became embroiled in the debate about the ethics of facial recognition three weeks ago after its owner released a short statement saying its cameras "use a number of detection and tracking methods, including facial recognition". That made it one of the first landowners to acknowledge it was deploying the software, described by human rights groups as authoritarian, partly because it captures and analyses images of people without their consent.
King's Cross Central's developers said they wanted facial-recognition software to spot people on the site who had previously committed an offence there. The detail emerged in a letter one of its managers sent to the London mayor on 14 August. Sadiq Khan had sought reassurance that the use of facial recognition on the site was legal. Two days earlier, Argent had indicated it was using the technology to "ensure public safety". On Monday, it said it had now scrapped work on new uses of the technology.