Erik Learned-Miller is one reason we talk about facial recognition at all. In 2007, years before the current A.I. boom made "deep learning" and "neural networks" common phrases in Silicon Valley, Learned-Miller and three colleagues at the University of Massachusetts Amherst released a dataset of faces called Labeled Faces in the Wild. To you or me, Labeled Faces in the Wild just looks like folders of unremarkable images. You can download them and look for yourself. There's boxer Joe Gatti, gloves raised mid-fight.
In previous work [6, 9, 10], we advanced a new technique for direct visual matching of images for the purposes of face recognition and image retrieval, using a probabilistic measure of similarity based primarily on a Bayesian (MAP) analysis of image differences, leading to a "dual" basis similar to eigenfaces. The performance advantage of this probabilistic matching technique over standard Euclidean nearest-neighbor eigenface matching was recently demonstrated using results from DARPA's 1996 "FERET" face recognition competition, in which this probabilistic matching algorithm was found to be the top performer. We have further developed a simple method of replacing the costly computation of nonlinear (online) Bayesian similarity measures with the relatively inexpensive computation of linear (offline) subspace projections and simple (online) Euclidean norms, resulting in a significant computational speedup for implementation with the very large image databases typically encountered in real-world applications.
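The offline/online split described above can be illustrated with a minimal eigenface-style sketch: a subspace is learned and the gallery is projected once offline, so that matching a probe online costs only one projection and cheap Euclidean norms. This is an assumed toy setup with random stand-in data, not the authors' actual dual-basis method; the subspace dimension `k = 32` and all array shapes are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in data: 200 gallery "face" vectors of dimension 1024
# (a real system would use flattened, aligned face images).
gallery = rng.normal(size=(200, 1024))

# --- Offline phase: learn a linear subspace (PCA / eigenfaces) ---
mean = gallery.mean(axis=0)
centered = gallery - mean
# SVD of the centered data gives the principal directions.
_, _, vt = np.linalg.svd(centered, full_matrices=False)
k = 32                               # retained eigenfaces (assumed)
basis = vt[:k]                       # (k, 1024) projection matrix

# Project the whole gallery once, offline.
gallery_coords = centered @ basis.T  # (200, k)

# --- Online phase: a probe needs only one projection plus
# simple Euclidean norms against the precomputed coordinates. ---
def match(probe):
    coords = (probe - mean) @ basis.T
    dists = np.linalg.norm(gallery_coords - coords, axis=1)
    return int(np.argmin(dists))

probe = gallery[17] + 0.01 * rng.normal(size=1024)  # noisy copy of image 17
print(match(probe))  # expected to recover index 17
```

The point of the split is that the expensive linear algebra (SVD, gallery projection) happens once, while each query reduces to a matrix-vector product and an argmin over distances.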
This week's furor over FaceApp has largely centered on concerns that its Russian developers might be compelled to share the app's data with the Russian government, much as the Snowden disclosures illustrated the myriad ways in which American companies were compelled to disclose their private user data to the US government. Yet this reflects a mistaken understanding of how the modern data trade actually works: American universities and companies routinely make their data available to companies all across the world, including in Russia and China. In today's globalized world, data is just as globalized, with national borders no longer restricting the flow of our personal information, a trend made worse by the data-hungry world of deep learning. Data brokers have long bought and sold our personal data in a shadowy world of international trade involving our most intimate and private information. The digital era has upended this explicit trade, replacing it with an interlocking world of passive exchange via analytics services.
US army researchers have developed a convolutional neural network and a range of algorithms to recognise faces in the dark. "This technology enables matching between thermal face images and existing biometric face databases or watch lists that only contain visible face imagery," explained Benjamin Riggan on Monday, co-author of the study and an electronics engineer at the US army laboratory. "The technology provides a way for humans to visually compare visible and thermal facial imagery through thermal-to-visible face synthesis." The thermal images are processed and passed to a convolutional neural network, which extracts facial features using landmarks at the corners of the eyes, nose and lips to determine the face's overall shape. The system, dubbed "multi-region synthesis," is trained with a loss function so that the error between the thermal images and the visible ones is minimized, creating an accurate portrayal of what someone's face looks like despite only glimpsing it in the dark.
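The training idea in that last sentence, minimizing a loss over the error between synthesized and real visible imagery, can be sketched in miniature. The snippet below is a heavily simplified, assumed setup: a linear map stands in for the army lab's convolutional "multi-region synthesis" network, random vectors stand in for paired thermal/visible images, and plain gradient descent stands in for whatever optimizer the study used. All names and dimensions are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for thermal -> visible synthesis: learn a map W that
# turns a "thermal" feature vector into a "visible" one by minimizing
# reconstruction error, the role the paper's loss function plays.
d_thermal, d_visible, n = 16, 16, 500
W_true = rng.normal(size=(d_visible, d_thermal))
thermal = rng.normal(size=(n, d_thermal))
visible = thermal @ W_true.T          # paired ground-truth "visible" data

W = np.zeros((d_visible, d_thermal))  # synthesizer parameters
lr = 0.05
for step in range(300):
    pred = thermal @ W.T              # synthesized visible images
    err = pred - visible              # error vs. the real visible images
    loss = (err ** 2).mean()          # mean-squared reconstruction loss
    grad = 2 * err.T @ thermal / n    # gradient of the loss w.r.t. W
    W -= lr * grad                    # gradient-descent update

print(loss)  # shrinks toward zero as the synthesis improves
```

The real system replaces the linear map with a deep network and the squared error with a richer loss over facial regions, but the loop structure, synthesize, measure error against visible ground truth, update, is the same.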
A growing backlash against face recognition suggests the technology has reached a crucial tipping point, as battles over its use erupt on numerous fronts. Face-tracking cameras have been trialled in public by at least three UK police forces in the last four years. A court case against one force, South Wales Police, began earlier this week, backed by human rights group Liberty. Ed Bridges, an office worker from Cardiff whose image was captured during a test in 2017, says the technology is an unlawful violation of privacy, an accusation the police force denies. Avoiding the camera's gaze has got others in trouble.