Facial recognition technology scrapped at King's Cross site

The Guardian

Facial recognition technology will not be deployed at the King's Cross development in the future, following a backlash prompted by the site owner's admission last month that the software had been used in its CCTV systems. The developer behind the prestigious central London site said the surveillance software had been used between May 2016 and March 2018 in two cameras on a busy pedestrian street running through its heart. It said it had abandoned plans for a wider deployment across the 67-acre, 50-building site and had "no plans to reintroduce any form of facial recognition technology at the King's Cross Estate". The site became embroiled in the debate about the ethics of facial recognition three weeks ago after releasing a short statement saying its cameras "use a number of detection and tracking methods, including facial recognition". That made it one of the first landowners to acknowledge it was deploying the software, described by human rights groups as authoritarian, partly because it captures and analyses images of people without their consent.


London mayor writes to King's Cross owner over facial recognition

The Guardian

The mayor of London has written to the owner of the King's Cross development demanding to know whether the company believes its use of facial recognition software in its CCTV systems is legal. Sadiq Khan said he wanted to express his concern a day after the property company behind the 27-hectare (67-acre) central London site admitted it was using the technology "in the interests of public safety". In his letter, shared with the Guardian, the Labour mayor writes to Robert Evans, the chief executive of the King's Cross development, to "request more information about exactly how this technology is being used". Khan also asks for "reassurance that you have been liaising with government ministers and the Information Commissioner's Office to ensure its use is fully compliant with the law as it stands". The owner of King's Cross is one of the first property companies to acknowledge it is deploying facial recognition software, even though the technology has been criticised by the human rights group Liberty as "a disturbing expansion of mass surveillance".


ICO opens investigation into use of facial recognition in King's Cross

#artificialintelligence

The UK's privacy watchdog has opened an investigation into the use of facial recognition cameras in a busy part of central London. The information commissioner, Elizabeth Denham, announced she would look into the technology being used in Granary Square, close to King's Cross station. Two days earlier the mayor of London, Sadiq Khan, had written to the development's owner demanding to know whether the company believed its use of facial recognition software in its CCTV systems was legal. The Information Commissioner's Office (ICO) said it was "deeply concerned about the growing use of facial recognition technology in public spaces" and was seeking detailed information about how the technology was being used. "Scanning people's faces as they lawfully go about their daily lives in order to identify them is a potential threat to privacy that should concern us all," Denham said.


Facial recognition row: police gave King's Cross owner images of seven people

The Guardian

Images of seven people were passed on by local police for use in a facial recognition system at King's Cross in London under an agreement struck in secret, the details of which were made public for the first time today. A police report, published by the deputy London mayor Sophie Linden on Friday, showed that the scheme ran for two years from 2016 without any apparent central oversight from either the Metropolitan police or the office of the mayor, Sadiq Khan. Writing to London assembly members, Linden said she "wanted to pass on the [Metropolitan police service's] apology" for failing to previously disclose that the scheme existed, and announced that similar local image-sharing agreements were now banned. There had been "no other examples of images having been shared with private companies for facial recognition purposes" by the Met, Linden said, according to "the best of its knowledge and record-keeping". The surveillance scheme – controversial because it involved tracking individuals without their consent – was originally agreed between borough police in Camden and the owner of the 27-hectare King's Cross site in 2016.


UK looks the other way on AI

#artificialintelligence

When Wales takes on Ireland in the Six Nations rugby championship on Saturday, Big Brother will be watching. Fans filing into the stadium in Cardiff will be scanned with facial recognition software as part of a police trial of the technology. Should any of their faces match a database of potential suspects, officers will be standing by, ready to swoop. It's the kind of indiscriminate mass surveillance that would, in ordinary times, be expected to prompt fierce debate in the U.K., with journalists and politicians arguing over the proper balance between privacy and security. Instead, trial runs like the one in South Wales are taking place largely unchallenged by parliament.