Advanced technology may reveal how the brain learns faces


Facial recognition technology has advanced swiftly in the last five years. As University of Texas at Dallas researchers try to determine how computers have become as good as people at the task, they are also shedding light on how the human brain sorts information.

UT Dallas scientists have analyzed the performance of the latest generation of facial recognition algorithms, revealing the surprising way these machine-learning programs work. Their study, published online Nov. 12 in Nature Machine Intelligence, shows that these sophisticated programs--called deep convolutional neural networks (DCNNs)--learned to identify faces differently than the researchers expected.

"For the last 30 years, people have presumed that computer-based visual systems get rid of all the image-specific information--angle, lighting, expression and so on," said Dr. Alice O'Toole, senior author of the study and the Aage and Margareta Møller Professor in the School of Behavioral and Brain Sciences.
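The idea at stake--that a face representation can identify a person while still retaining image-specific details--can be made concrete with a toy sketch. Everything below is invented for illustration: synthetic vectors stand in for DCNN face embeddings, and "lighting" is an assumed image attribute. It is not the study's method; it only shows how one representation can support identity recognition while an attribute like lighting remains decodable from the same vector.

```python
# Illustrative sketch only: synthetic "embeddings" standing in for the output
# of a face-recognition network. Identity and lighting each contribute a
# direction in the embedding space, so both can be read back out.
import numpy as np

rng = np.random.default_rng(0)
dim = 64
n_ids, n_lights = 5, 3

# Random directions for each identity and each lighting condition (assumed).
id_dirs = rng.normal(size=(n_ids, dim))
light_dirs = rng.normal(size=(n_lights, dim))

def embed(identity, lighting):
    """Toy embedding: strong identity signal + weaker lighting signal + noise."""
    return id_dirs[identity] + 0.5 * light_dirs[lighting] + 0.1 * rng.normal(size=dim)

# Build a small gallery and compute per-class centroids for both attributes.
samples = [(embed(i, l), i, l)
           for i in range(n_ids) for l in range(n_lights) for _ in range(10)]
X = np.stack([s[0] for s in samples])
ids = np.array([s[1] for s in samples])
lights = np.array([s[2] for s in samples])

id_centroids = np.stack([X[ids == i].mean(axis=0) for i in range(n_ids)])
light_centroids = np.stack([X[lights == l].mean(axis=0) for l in range(n_lights)])

def nearest(x, centroids):
    """Index of the centroid closest to x (a minimal nearest-centroid probe)."""
    return int(np.argmin(np.linalg.norm(centroids - x, axis=1)))

# A new image of identity 2 under lighting condition 1: the same vector
# supports both judgments, i.e. the "image-specific" signal was not discarded.
probe = embed(2, 1)
print("identity:", nearest(probe, id_centroids))
print("lighting:", nearest(probe, light_centroids))
```

The design point is that nothing forces a learned representation to throw lighting away: as long as the lighting direction is not collapsed to zero, a simple probe can recover it alongside identity, which is the kind of coexistence the study reports.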