The use of facial recognition by police and other law enforcement is proving divisive, with Verdict readers split over its use. In a poll on Verdict that drew responses from 644 readers between 24 January and 7 February, a majority said they were not happy with the use of facial recognition by police, but only by a slim margin. The response comes as the EU considers a ban on the use of facial recognition until the technology reaches a greater stage of maturity. A draft white paper, first published by the news website EURACTIV in January, showed that a temporary ban was being considered by the European Commission. It proposed that "use of facial recognition technology by private or public actors in public spaces would be prohibited for a definite period".
A legal code of practice is needed before face recognition technology can be safely deployed by police forces in public places, says the UK's data regulator. In a blog post, the Information Commissioner's Office (ICO) said it had serious concerns about the use of the technology because it relies on large amounts of personal information. Current laws, codes and practices "will not drive the ethical and legal approach that's needed to truly manage the risk that this technology presents," said information commissioner Elizabeth Denham. She called for police forces to be compelled to show justification that face recognition is "strictly necessary, balanced and effective" in each case it is deployed. Face recognition can map faces in a crowd by measuring the distance between facial features, then compare the results with a "watch list" of images, which can include suspects, missing people and persons of interest.
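The matching step described above can be sketched in outline: each face is reduced to a vector of measurements, and a probe face is compared against watch-list templates by distance, with a threshold deciding whether to raise an alert. This is a minimal illustration only; the names, the three-number feature vectors and the 0.10 threshold are invented for the example, not drawn from any police system.

```python
import math

# Hypothetical feature vectors: distances between facial landmarks
# (e.g. eye spacing, nose-to-mouth, jaw width), normalised to [0, 1].
watchlist = {
    "suspect_A": [0.42, 0.31, 0.77],
    "missing_person_B": [0.55, 0.29, 0.63],
}

def best_match(probe, watchlist, threshold=0.10):
    """Return the closest watch-list entry, or None when no
    template lies within the matching threshold."""
    best_name, best_dist = None, float("inf")
    for name, template in watchlist.items():
        dist = math.dist(probe, template)  # Euclidean distance
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None

print(best_match([0.41, 0.30, 0.78], watchlist))  # a face close to suspect_A
print(best_match([0.10, 0.90, 0.20], watchlist))  # a face matching no one listed
```

The threshold is the crux: set it loose and innocent passers-by are flagged; set it tight and genuine suspects slip through, which is the trade-off behind the accuracy criticisms reported elsewhere in this piece.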
The last day of January 2019 was sunny, yet bitterly cold in Romford, east London. Shoppers scurrying from retailer to retailer wrapped themselves in winter coats, scarves and hats. The temperature never rose above three degrees Celsius. For police officers positioned next to an inconspicuous blue van, just metres from Romford's Overground station, one man stood out among the thin winter crowds. The man, wearing a beige jacket and blue cap, had pulled his jacket over his face as he moved in the direction of the police officers.
Gordon's wine bar is reached through a discreet side-door, a few paces from the slipstream of London theatregoers and suited professionals powering towards their evening train. A steep staircase plunges visitors into a dimly lit cavern, lined with dusty champagne bottles and faded newspaper clippings, which appears to have had only minor refurbishment since it opened in 1890. "If Miss Havisham was in the licensing trade," an Evening Standard review once suggested, "this could have been the result." The bar's Dickensian gloom is a selling point for people embarking on affairs, and actors or politicians wanting a quiet drink – but also for pickpockets. When Simon Gordon took over the family business in the early 2000s, he would spend hours scrutinising the faces of the people who haunted his CCTV footage. "There was one guy who I almost felt I knew," he says. "He used to come down here the whole time and steal." The man vanished for a six-month stretch, but then reappeared, chubbier, apparently after a stint in jail.
Images of seven people were passed on by local police for use in a facial recognition system at King's Cross in London in an agreement that was struck in secret, the details of which were made public for the first time today. A police report, published by the deputy London mayor Sophie Linden on Friday, showed that the scheme ran for two years from 2016 without any apparent central oversight from either the Metropolitan police or the office of the mayor, Sadiq Khan. Writing to London assembly members, Linden said she "wanted to pass on the [Metropolitan police service's] apology" for failing to previously disclose that the scheme existed and announced that similar local image sharing agreements were now banned. There had been "no other examples of images having been shared with private companies for facial recognition purposes" by the Met, Linden said, according to "the best of its knowledge and record-keeping". The surveillance scheme – controversial because it involved tracking individuals without their consent – was originally agreed between borough police in Camden and the owner of the 27-hectare King's Cross site in 2016.
Privacy campaigners have warned of an "epidemic" of facial recognition use in shopping centres, museums, conference centres and other private spaces around the UK. An investigation by Big Brother Watch (BBW), which tracks the use of surveillance, has found that private companies are spearheading a rollout of the controversial technology. The group published its findings a day after the information commissioner, Elizabeth Denham, announced she was opening an investigation into the use of facial recognition in a major new shopping development in central London. Sadiq Khan, the mayor of London, has already raised questions about the legality of the use of facial recognition at the 27-hectare (67-acre) Granary Square site in King's Cross after its owners admitted using the technology "in the interests of public safety". BBW said it had uncovered that sites across the country were using facial recognition, often without warning visitors.
Artificial intelligence software capable of interpreting images, matching faces and analysing patterns of communication is being piloted by UK police forces to speed up examination of mobile phones seized in crime investigations. Cellebrite, the Israeli-founded and now Japanese-owned company behind some of the software, claims a wider rollout would solve problems over failures to disclose crucial digital evidence that have led to the collapse of a series of rape trials and other prosecutions in the past year. However, the move by police has prompted concerns over privacy and the potential for software to introduce bias into processing of criminal evidence. As police and lawyers struggle to cope with the exponential rise in data volumes generated by phones and laptops in even routine crime cases, the hunt is on for a technological solution to handle increasingly unmanageable workloads. Some forces are understood to have backlogs of up to six months for examining downloaded mobile phone contents.
The accuracy of police facial recognition systems has been criticised by a UK privacy group. Two forces have been testing facial recognition cameras at public events in an effort to catch wanted criminals. Big Brother Watch said its investigation showed the technology was "dangerous and inaccurate" as it had wrongly flagged up a "staggering" number of innocent people as suspects. But police have defended its use and say additional safeguards are in place. Police facial recognition cameras have been trialled at events such as football matches, festivals and parades.
At the end of each summer for the last 14 years, the small Welsh town of Porthcawl has been invaded. Every year its population of 16,000 is swamped by up to 35,000 Elvis fans. Many people attending the yearly festival look the same: they slick back their hair, throw on oversized sunglasses and don white flares. At 2017's Elvis festival, impersonators were faced with something different. Police were trialling automated facial recognition technology to track down criminals.
Facial recognition technology used by UK police is making thousands of mistakes, a new report has found. South Wales Police, London's Met and Leicestershire are all trialling automated facial recognition systems in public places to identify wanted criminals. According to police figures, the system often makes more incorrect matches than correct ones. Experts warned the technology could lead to false arrests and described it as a "dangerously inaccurate policing tool". South Wales Police has been testing an automated facial recognition system.
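"More incorrect matches than correct ones" is a claim about precision: the share of alerts that turn out to be right. The arithmetic can be made concrete with invented figures (the numbers below are illustrative only, not the forces' actual statistics):

```python
# Illustrative trial figures (hypothetical): the system raises 100 alerts,
# of which only 8 correspond to people genuinely on the watch list.
true_matches = 8
false_matches = 92
total_flags = true_matches + false_matches

precision = true_matches / total_flags           # share of alerts that are correct
false_alarm_share = false_matches / total_flags  # share flagging innocent people

print(f"precision: {precision:.0%}")             # 8%
print(f"false alarms: {false_alarm_share:.0%}")  # 92%
```

When correct matches are outnumbered by incorrect ones, precision sits below 50% by definition, even if the underlying per-face error rate sounds small, because the crowds scanned contain far more innocent faces than wanted ones.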