Police technology capable of identifying people from video footage is now so advanced it is 'running ahead of the law', a UK government report has claimed. This method of identification, using biological features, is known as biometrics. Investment from both public and private companies has driven rapid advances in Big Brother-style facial recognition and biometric technology in recent years. However, legislation has not moved at the same rate. The commissioner behind the new government report claims legislation is 'urgently' needed to protect the privacy of innocent people.
With an estimated 4.2 million CCTV cameras in the UK alone, it will come as no surprise that you're caught on camera on a regular basis. But a shocking new report has revealed that images of you from CCTV might also be scanned by facial recognition technology – regardless of whether you're doing anything wrong. The report suggests that UK police have amassed a collection of 19 million photos – equating to around 30 per cent of the British population. The findings come from the annual report of the Commissioner for the Retention and Use of Biometric Material, written by Paul Wiles, the Biometrics Commissioner. He says the database of faces collected from CCTV footage is designed to weed out criminals, but contains images of many people with no criminal record.
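As a rough sanity check on the "around 30 per cent" figure, the 19 million photos can be set against the UK population (assumed here to be roughly 65 million at the time; that figure is not stated in the report):

```python
# Figures from the report, plus an assumed UK population of ~65 million
photos = 19_000_000          # custody photographs held by police
uk_population = 65_000_000   # assumption; not stated in the article

share = photos / uk_population * 100
print(f"{share:.0f}% of the population")  # → 29% of the population
```

This comes out at about 29 per cent, consistent with the "around 30 per cent" claim.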
The "rapid" growth of a police facial recognition database could lead to innocent people being unfairly targeted, a watchdog has warned. Biometrics Commissioner Paul Wiles said the Police National Database (PND) now had at least 19 million custody photographs on it. However, it is thought that hundreds of thousands of these could be of innocent people. The Home Office said police should delete images of unconvicted people. In a government review published in February, the Home Office concluded that those who are not convicted should have the right to request that their custody image is deleted from all police databases.
Facial recognition software used by the UK's biggest police force has returned false positives in more than 98 per cent of alerts generated, The Independent can reveal, with the country's biometrics regulator calling it "not yet fit for use". The Metropolitan Police's system has produced 104 alerts of which only two were later confirmed to be positive matches, a freedom of information request showed. In its response the force said it did not consider the inaccurate matches "false positives" because alerts were checked a second time after they occurred. Facial recognition technology scans people in a video feed and compares their images to pictures stored in a reference library or watch list. It has been used at large events like the Notting Hill Carnival and a Six Nations Rugby match.
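The "more than 98 per cent" figure can be reproduced from the numbers disclosed in the freedom of information response:

```python
# Figures from the Metropolitan Police's freedom of information response
total_alerts = 104   # alerts generated by the facial recognition system
true_matches = 2     # alerts later confirmed as genuine matches

false_positive_rate = (total_alerts - true_matches) / total_alerts * 100
print(f"{false_positive_rate:.1f}% of alerts were false positives")  # → 98.1%
```

That is, 102 of the 104 alerts were incorrect, a false-positive rate of about 98.1 per cent.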
Police forces hold more than 20 million mugshots, and this risks undermining public confidence, a watchdog has warned. Professor Paul Wiles, the independent Biometrics Commissioner, said there was a real danger the number of facial recognition images stored would rocket. He said the lack of laws controlling the use of the crime-fighting technology risked damaging confidence in the UK's model of policing by consent. Forces store more than 20 million pictures and videos, known as custody images, taken at police stations of people they have arrested or questioned.