Facial recognition technology used by UK police is making thousands of mistakes, a new report has found. South Wales Police, London's Met and Leicestershire are all trialling automated facial recognition systems in public places to identify wanted criminals. According to police figures, the systems often make more incorrect matches than correct ones. Experts warned the technology could lead to false arrests and described it as a 'dangerously inaccurate policing tool'.
At the end of each summer for the last 14 years, the small Welsh town of Porthcawl has been invaded. Every year its population of 16,000 is swamped by up to 35,000 Elvis fans. Many people attending the yearly festival look the same: they slick back their hair, throw on oversized sunglasses and don white flares. At the 2017 Elvis festival, however, impersonators were faced with something different. Police were trialling automated facial recognition technology to track down criminals.
A police force's use of facial recognition technology requires 'considerable investment' to deliver consistent results, a study has concluded. Crashing computer systems and poor quality images are among the challenges South Wales Police officers have faced since rolling out the technology. Large crowds, low lighting and people wearing glasses were all issues the AI software struggled to cope with, experts found. South Wales Police first deployed automated facial recognition at the 2017 Champions League final in Cardiff, where the technology wrongly matched more than 2,000 people to possible criminals.
When Wales takes on Ireland in the Six Nations rugby championship on Saturday, Big Brother will be watching. Fans filing into the stadium in Cardiff will be scanned with facial recognition software as part of a police trial of the technology. Should any of their faces match a database of potential suspects, officers will be standing by, ready to swoop. It's the kind of indiscriminate mass surveillance that would, in ordinary times, be expected to spark fierce debate in the UK, with journalists and politicians fighting over the proper balance between privacy and security. Instead, trial runs like the one in South Wales are taking place largely unchallenged by parliament.
Thousands of attendees of the 2017 Champions League final in Cardiff, Wales, were mistakenly identified as potential criminals by facial recognition technology used by local law enforcement. According to the Guardian, South Wales Police scanned the crowd of more than 170,000 people who travelled to the Welsh capital for the match between Real Madrid and Juventus. The cameras flagged 2,470 people as criminals. Having that many potential lawbreakers in attendance might make sense if the event were, say, a convict convention, but it seems rather high for a football match. As it turned out, the cameras were a little overly aggressive in trying to spot some bad guys.