ICO opens investigation into use of facial recognition in King's Cross

#artificialintelligence

The UK's privacy watchdog has opened an investigation into the use of facial recognition cameras in a busy part of central London. The information commissioner, Elizabeth Denham, announced she would look into the technology being used in Granary Square, close to King's Cross station. Two days ago the mayor of London, Sadiq Khan, wrote to the development's owner demanding to know whether the company believed its use of facial recognition software in its CCTV systems was legal. The Information Commissioner's Office (ICO) said it was "deeply concerned about the growing use of facial recognition technology in public spaces" and was seeking detailed information about how it is used. "Scanning people's faces as they lawfully go about their daily lives in order to identify them is a potential threat to privacy that should concern us all," Denham said.


Police could face legal action over 'authoritarian' facial recognition cameras

Daily Mail - Science & tech

Facial recognition technology used by the UK police is making thousands of mistakes - and now there could be legal repercussions. The civil liberties group Big Brother Watch has teamed up with Baroness Jenny Jones to ask the government and the Met to stop using the technology. They claim the use of facial recognition has proven to be 'dangerously authoritarian', inaccurate and a breach of rights protecting privacy and freedom of expression. If their request is rejected, the group says it will take the case to court in what will be the first legal challenge of its kind. South Wales Police, London's Met and Leicestershire are all trialling automated facial recognition systems in public places to identify wanted criminals.


Police face legal action over use of facial recognition cameras

The Guardian

Two legal challenges have been launched against police forces in south Wales and London over their use of automated facial recognition (AFR) technology on the grounds the surveillance is unregulated and violates privacy. The claims are backed by the human rights organisations Liberty and Big Brother Watch following complaints about biometric checks at the Notting Hill carnival, on Remembrance Sunday, at demonstrations and in high streets. Liberty is supporting Ed Bridges, a Cardiff resident, who has written to the chief constable of South Wales police alleging he was tracked at a peaceful anti-arms protest and while out shopping. Big Brother Watch is working with the Green party peer Jenny Jones who has written to the home secretary, Sajid Javid, and the Metropolitan police commissioner, Cressida Dick, urging them to halt deployment of the "dangerously authoritarian" technology. If the forces do not stop using AFR systems then legal action will follow in the high court, the letters said.


Police facial recognition system faces legal challenge

BBC News

A legal challenge against the use of automatic facial recognition technology by police has been launched by a civil liberties group. Automatic facial recognition (AFR) uses CCTV or surveillance cameras to record and compare facial characteristics with images on police databases. Lawyers for Big Brother Watch argue the use of AFR breaches the rights of individuals under the Human Rights Act. The Metropolitan Police says the technology will help keep London safe. The system is being piloted in London, with three other forces - Humberside, South Wales, and Leicestershire - also trialling the technology.


Police trial AI software to help process mobile phone evidence

#artificialintelligence

Artificial intelligence software capable of interpreting images, matching faces and analysing patterns of communication is being piloted by UK police forces to speed up examination of mobile phones seized in crime investigations. Cellebrite, the Israeli-founded and now Japanese-owned company behind some of the software, claims a wider rollout would solve problems over failures to disclose crucial digital evidence that have led to the collapse of a series of rape trials and other prosecutions in the past year. However, the move by police has prompted concerns over privacy and the potential for software to introduce bias into processing of criminal evidence. As police and lawyers struggle to cope with the exponential rise in data volumes generated by phones and laptops in even routine crime cases, the hunt is on for a technological solution to handle increasingly unmanageable workloads. Some forces are understood to have backlogs of up to six months for examining downloaded mobile phone contents.