Achieving the upper limits of face identification accuracy in forensic applications can minimize errors that have profound social and personal consequences. Although forensic examiners identify faces in these applications, systematic tests of their accuracy are rare. How can we achieve the most accurate face identification: with people, with machines, or with the two working in collaboration? In a comprehensive comparison of face identification by humans and computers, we found that forensic facial examiners, facial reviewers, and superrecognizers were more accurate than fingerprint examiners and students on a challenging face identification test. Individual performance on the test varied widely.
A study appearing today in the Proceedings of the National Academy of Sciences has brought answers. In work that combines forensic science with psychology and computer vision research, a team of scientists from the National Institute of Standards and Technology (NIST) and three universities has tested the accuracy of professional face identifiers, providing at least one revelation that surprised even the researchers: Trained human beings perform best with a computer as a partner, not another person. "This is the first study to measure face identification accuracy for professional forensic facial examiners, working under circumstances that apply in real-world casework," said NIST electronic engineer P. Jonathon Phillips. "Our deeper goal was to find better ways to increase the accuracy of forensic facial comparisons." The team's effort began in response to a 2009 report by the National Research Council, "Strengthening Forensic Science in the United States: A Path Forward," which underscored the need to measure the accuracy of forensic examiner decisions.
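The collaboration finding above, that a trained examiner paired with an algorithm outperforms either alone, rests on fusing two identity judgments into a single score. A minimal sketch of that idea, assuming both judgments have been rescaled to [0, 1] (the function name, weights, and example values here are illustrative, not taken from the study):

```python
# Hypothetical fusion of a human examiner's judgment with an
# algorithm's similarity score. Both inputs are assumed to be
# normalized same-identity scores in [0, 1]; higher means "more
# likely the same person". Weighting is a free design choice.

def fuse_judgments(human_score, algorithm_score, weight=0.5):
    """Combine two normalized judgments by weighted average."""
    return weight * human_score + (1 - weight) * algorithm_score

# Example: examiner fairly confident the faces match (0.8),
# algorithm slightly less so (0.6).
fused = fuse_judgments(0.8, 0.6)
print(fused)  # 0.7 with equal weighting
```

Averaging is only one way to fuse judgments; the point is that the combined score can draw on independent errors from the human and the machine, which is why the pairing can beat either alone.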
Legal experts warn that people's online photos are being used without permission to power facial-recognition technology that could eventually be used for surveillance. Said New York University School of Law's Jason Schultz, "This is the dirty little secret of [artificial intelligence] training sets. Researchers often just grab whatever images are available in the wild." IBM recently released a set of nearly 1 million photos culled from the image-hosting site Flickr, annotated to describe subjects' appearance, ostensibly to help reduce bias in facial recognition; although IBM said Flickr users can opt out of the database, getting photos deleted is almost impossible.
Scientists are working on a kickass new twist to the classic buddy cop movie genre. Get this: cyberterrorist Marcus Hurricane is going to walk free unless police detective Rick Danger can place him at the scene of the crime. But all he has to go on are some grainy security camera images, and he can't quite make out Hurricane's signature badass face scars. Enter: detective Danger's trusty AI cyborg sidekick, Sparky. Together, they have what it takes to save the day.
German authorities have launched a six-month trial of automatic facial-recognition technology at Berlin's Suedkreuz railway station. More than 200 people volunteered to have their names and two photos stored for the project. Three cameras installed at the station film an entrance and an escalator; the footage is automatically scanned by a computer programme, which compares it with the photos stored in a database. While German authorities are optimistic about the programme, security experts say there is a high potential for errors, which could allow criminals to slip through the system.
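The matching step described above, comparing camera footage against enrolled photos, typically works on numeric face embeddings rather than raw images. A minimal sketch under that assumption, with made-up names, toy 3-D "embeddings", and an arbitrary threshold (a real system would use a trained face-embedding network and calibrated thresholds):

```python
import math

# Illustrative one-to-many face matching: compare a probe embedding
# against every enrolled embedding by cosine similarity and report
# enrollees whose score clears a threshold. All values are toy data.

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors of equal length."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_probe(probe, database, threshold=0.9):
    """Return (name, score) pairs above threshold, best match first."""
    hits = []
    for name, enrolled in database.items():
        score = cosine_similarity(probe, enrolled)
        if score >= threshold:
            hits.append((name, score))
    return sorted(hits, key=lambda hit: -hit[1])

# Toy enrollment database of two volunteers.
db = {
    "volunteer_a": [1.0, 0.0, 0.2],
    "volunteer_b": [0.0, 1.0, 0.1],
}
probe = [0.95, 0.05, 0.2]  # embedding extracted from camera footage
print(match_probe(probe, db))
```

The threshold is where the error trade-off the security experts worry about lives: set it low and impostors slip through as false matches; set it high and enrolled targets go unrecognized.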