UK police fail to use facial recognition ethically and legally, study finds

Engadget

Use of live facial recognition (LFR) by UK police forces "fail[s] to meet the minimum ethical and legal standards," according to a study from the University of Cambridge. After analyzing LFR use by the Metropolitan (Met) and South Wales police, researchers concluded that the technology should be banned from use in "all public spaces." LFR matches faces captured by security cameras against database photos. China and other non-democratic regimes have used the technology as part of their state surveillance apparatus. UK police have been testing its use in multiple situations to fight crime and terrorism.


UK police use of live facial recognition unlawful and unethical, report finds

The Guardian

Police should be banned from using live facial recognition technology in all public spaces because they are breaking ethical standards and human rights laws, a study has concluded. LFR involves linking cameras to databases containing photos of people. Images from the cameras can then be checked against those photos to see if they match. British police have experimented with the technology, believing it can help combat crime and terrorism. But in some cases, courts have found against the way police have used LFR, and how they have dealt with infringements of the privacy rights of people walking in the streets where the technology has been used.


Snooping on the police: can AI clean up the Met? - Raconteur

#artificialintelligence

Shamed and appalled by the brutal murder of Sarah Everard at the hands of a serving officer, the British public demanded a swift response from the Metropolitan Police Service. A subsequent review into the conduct of officers based at Charing Cross in London unearthed a toxic environment where colleagues bonded over jokes about rape, killing black children and beating their wives. Heads had to roll, starting with the former Met Police Service commissioner Dame Cressida Dick. The poor handling of the Everard case did little to dispel the conclusion of its own watchdog that the Met is "systematically and institutionally corrupt". Inspector of Constabulary Matt Parr said that the Met had "sometimes behaved in ways that make it appear arrogant, secretive and lethargic" in response to investigations into dirty cops, and that it did "not have the capability to proactively monitor" communications with any effect, "despite repeated warnings from the inspectorate".


Police are failing to consult the public about their use of AI, charity warns

#artificialintelligence

The police are failing to consult the public about their growing use of technologies including AI-driven facial recognition and automated decision systems (ADS), a charity has warned. South Wales Police is the only police force in the UK known to be using AI in its policing to have confirmed it consulted with its local communities about its use, according to a report from The Royal Society for the encouragement of Arts, Manufactures and Commerce (RSA). A Freedom of Information request returned in March found that London's Metropolitan Police, which began using live facial recognition tech in February following years of trials, had no record of consulting the public, despite suggesting that this would take place alongside deployment. The Met's software is deployed through signposted cameras focused on small areas, scanning the faces of passers-by in places the force believes are more likely to contain people wanted for serious and violent offences. The RSA sent requests to 45 territorial police forces, receiving confirmation that eight were using or trialling AI or ADS for policing decisions, including Durham Constabulary, Surrey Police and West Yorkshire Police.