Police could face legal action over 'authoritarian' facial recognition cameras

Daily Mail - Science & tech

Facial recognition technology used by the UK police is making thousands of mistakes - and now there could be legal repercussions. Civil liberties group Big Brother Watch has teamed up with Baroness Jenny Jones to ask the government and the Met to stop using the technology. They claim the use of facial recognition has proven to be 'dangerously authoritarian', inaccurate and a breach of rights protecting privacy and freedom of expression. If their request is rejected, the group says it will take the case to court in what would be the first legal challenge of its kind. South Wales Police, London's Met and Leicestershire are all trialling automated facial recognition systems in public places to identify wanted criminals.


Facial recognition tech used by UK police is making a ton of mistakes

#artificialintelligence

At the end of each summer for the last 14 years, the small Welsh town of Porthcawl has been invaded. Every year its population of 16,000 is swamped by up to 35,000 Elvis fans. Many people attending the yearly festival look the same: they slick back their hair, throw on oversized sunglasses and don white flares. At 2017's Elvis festival, impersonators were faced with something different: police were trialling automated facial recognition technology to track down criminals.


Police facial recognition camera trial hindered by software setbacks

Daily Mail - Science & tech

A police force's use of facial recognition technology requires 'considerable investment' to deliver consistent results, a study has concluded. Crashing computer systems and poor-quality images are among the challenges South Wales Police officers have faced since rolling out the technology. Large crowds, low lighting and people wearing glasses were all issues the AI software struggled to cope with, experts found. South Wales Police first deployed automated facial recognition at the 2017 Champions League final in Cardiff, where the technology wrongly matched more than 2,000 people to possible criminals.


Facial Recognition Used by Wales Police Has 90 Percent False Positive Rate

#artificialintelligence

Thousands of attendees of the 2017 Champions League final in Cardiff, Wales were mistakenly identified as potential criminals by facial recognition technology used by local law enforcement. According to the Guardian, the South Wales police scanned the crowd of more than 170,000 people who traveled to the nation's capital for the soccer match between Real Madrid and Juventus. The cameras identified 2,470 people as criminals. Having that many potential lawbreakers in attendance might make sense if the event were, say, a convict convention, but seems pretty high for a soccer match. As it turned out, the cameras were a little overly aggressive in trying to spot some bad guys.
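A quick back-of-the-envelope calculation makes the headline figure concrete. The 2,470 alerts come from the article above; the count of confirmed matches is not in the snippet (which only says more than 2,000 were wrong), so the 173 used below is an assumed figure from later reporting. A minimal sketch in Python:

```python
# Rough check of the Cardiff deployment's false positive rate.
# total_alerts is from the article; confirmed_matches (173) is an assumed
# figure from later reporting, not stated in the snippet above.
total_alerts = 2470
confirmed_matches = 173
false_positives = total_alerts - confirmed_matches  # 2,297

rate = false_positives / total_alerts
print(f"{false_positives} false positives out of {total_alerts} alerts")
print(f"False positive rate: {rate:.1%}")  # ~93.0%
```

On these numbers the rate lands well above ninety per cent, consistent with both the headline here and the 'more than 2,000' wrong matches cited in the previous story.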


Metropolitan Police's facial recognition technology 98% inaccurate, figures show

The Independent - Tech

Facial recognition software used by the UK's biggest police force has returned false positives in more than 98 per cent of the alerts generated, The Independent can reveal, with the country's biometrics regulator calling it "not yet fit for use". The Metropolitan Police's system has produced 104 alerts, of which only two were later confirmed to be positive matches, a freedom of information request showed. In its response, the force said it did not consider the inaccurate matches "false positives" because alerts were checked a second time after they occurred. Facial recognition technology scans people in a video feed and compares their images to pictures stored in a reference library or watch list. It has been used at large events such as the Notting Hill Carnival and a Six Nations rugby match.
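The watch-list comparison described above can be illustrated as an embedding lookup with a similarity threshold. The sketch below is generic and assumes a face-recognition model has already produced embedding vectors for each detected face; the `check_watch_list` helper, the cosine-similarity measure and the 0.6 cutoff are all illustrative assumptions, not details of the Met's system.

```python
import numpy as np

MATCH_THRESHOLD = 0.6  # assumed cutoff; real deployments tune this value


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def check_watch_list(face: np.ndarray, watch_list: dict) -> str | None:
    """Compare one face from the video feed against every stored watch-list
    embedding; return the best-scoring match above the threshold, or None."""
    best_name, best_score = None, MATCH_THRESHOLD
    for name, stored in watch_list.items():
        score = cosine_similarity(face, stored)
        if score > best_score:
            best_name, best_score = name, score
    return best_name


# Toy usage: random vectors stand in for real face embeddings.
rng = np.random.default_rng(0)
watch_list = {"suspect_a": rng.normal(size=128), "suspect_b": rng.normal(size=128)}
print(check_watch_list(rng.normal(size=128), watch_list))  # almost certainly None
```

The threshold is where figures like the 98 per cent come from in practice: lowering it catches more genuine matches but floods operators with false alerts, and on the Met's own FOI numbers 102 of the 104 alerts (about 98.1 per cent) were never confirmed.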