Faces recreated from monkey brain signals

BBC News

Scientists in the US have accurately reconstructed images of human faces by monitoring the responses of monkey brain cells. The brains of primates can resolve different faces with remarkable speed and reliability, but the underlying mechanisms are not fully understood. The researchers showed pictures of human faces to macaques and then recorded patterns of brain activity. The work could inspire new facial recognition algorithms, they report. In earlier investigations, Professor Doris Tsao from the California Institute of Technology (Caltech) and colleagues had used functional magnetic resonance imaging (fMRI) in humans and other primates to work out which areas of the brain were responsible for identifying faces.
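The underlying approach, reconstructing a stimulus from recorded neural responses, is often framed as a linear decoding problem: learn a mapping from firing-rate patterns back to stimulus features. Below is a minimal Python sketch of that general idea on synthetic data; the cell count, feature dimension, and the random "neural code" are all assumptions for illustration, not the team's actual method or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not the study's numbers):
# 200 face-selective cells, faces described by 50 shape/appearance features.
n_cells, n_features, n_faces = 200, 50, 1000

# Synthetic training set: face feature vectors and the firing rates they
# evoke, simulated here with a random linear map plus noise.
features = rng.standard_normal((n_faces, n_features))
neural_code = rng.standard_normal((n_features, n_cells))
rates = features @ neural_code + 0.1 * rng.standard_normal((n_faces, n_cells))

# Fit a linear decoder from firing rates back to face features
# by least squares.
decoder, *_ = np.linalg.lstsq(rates, features, rcond=None)

# Decode a held-out face: the predicted feature vector could then be
# rendered back into an image by a generative face model.
novel = rng.standard_normal((1, n_features))
response = novel @ neural_code
reconstruction = response @ decoder
print(np.corrcoef(novel.ravel(), reconstruction.ravel())[0, 1])  # close to 1.0
```

In a real experiment the decoder would be fitted on measured firing rates rather than a simulated code, and the decoded features fed into a face model to produce the reconstructed image.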


Monkeys should be able to talk just like us – so why don't they?

New Scientist

Ooh, ooh, ooh, ee, ee, ee! Shouting monkeys may have more sophisticated vocal abilities than we give them credit for. It seems that the anatomy of their vocal tract is theoretically capable of producing the five basic vowel sounds on which most human languages are based – and these could be used to form intelligible sentences. The researchers even synthesised what a monkey asking "Will you marry me?" would sound like. The results add to a growing body of evidence that some monkeys and apes can mimic or generate the rudimentary sounds needed for speech-like communication. "No one can say now that there's a vocal anatomy problem with monkey speech," says Asif Ghazanfar of Princeton University, co-leader of the study. "They have a speech-ready vocal anatomy, but not a speech-ready brain."
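Vowels are distinguished largely by their formants, the resonant frequencies of the vocal tract; the study's claim is that macaque tracts can reach configurations that produce distinct vowel formants. As a toy illustration of how a handful of formant values define a vowel, here is a minimal source-filter synthesis sketch in Python. The formant and bandwidth figures are classic human reference values used purely for illustration, not measurements from the study.

```python
import numpy as np
from scipy.signal import lfilter, sawtooth
from scipy.io import wavfile

FS = 16000  # sample rate in Hz

# Approximate first three formants (Hz) for three point vowels;
# textbook human values, for illustration only.
FORMANTS = {"a": (730, 1090, 2440),
            "i": (270, 2290, 3010),
            "u": (300, 870, 2240)}

def formant_filter(x, freq, bw=80.0, fs=FS):
    """Second-order IIR resonator centred on one formant frequency."""
    r = np.exp(-np.pi * bw / fs)
    b = [1.0 - r]  # rough gain normalisation
    a = [1.0, -2.0 * r * np.cos(2 * np.pi * freq / fs), r * r]
    return lfilter(b, a, x)

def synthesize_vowel(vowel, f0=120.0, dur=0.4):
    t = np.arange(int(FS * dur)) / FS
    voiced = sawtooth(2 * np.pi * f0 * t)   # crude glottal source
    for f in FORMANTS[vowel]:               # cascade the formant resonators
        voiced = formant_filter(voiced, f)
    return voiced / np.max(np.abs(voiced))  # normalise amplitude

audio = np.concatenate([synthesize_vowel(v) for v in "aiu"])
wavfile.write("vowels.wav", FS, (audio * 32767).astype(np.int16))
```

Shifting the formant frequencies is all it takes to move between vowels, which is why the anatomical question of whether the tract can reach the necessary shapes is the crux of the study.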



Machine-learning system processes sounds like humans do

#artificialintelligence

Using a machine-learning system known as a deep neural network, MIT researchers have created the first model that can replicate human performance on auditory tasks such as identifying a musical genre. The researchers used the model, which consists of many layers of information-processing units that can be trained on huge volumes of data to perform specific tasks, to shed light on how the human brain may perform the same tasks. "What these models give us, for the first time, is machine systems that can perform sensory tasks that matter to humans and that do so at human levels," says Josh McDermott, the Frederick A. and Carole J. Middleton Assistant Professor of Neuroscience in the Department of Brain and Cognitive Sciences at MIT and the senior author of the study. "Historically, this type of sensory processing has been difficult to understand, in part because we haven't really had a very clear theoretical foundation and a good way to develop models of what might be going on." The study, which appears in the April 19 issue of Neuron, also offers evidence that the human auditory cortex is organized hierarchically, much like the visual cortex.
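To make "many layers of information-processing units" concrete, here is a minimal sketch of a layered convolutional network over audio spectrograms, written in PyTorch. The architecture, layer sizes, input shape, and ten-genre output are illustrative assumptions, not the MIT model's actual configuration.

```python
import torch
import torch.nn as nn

class AudioNet(nn.Module):
    """Toy hierarchical model: early layers pick up local spectrotemporal
    patterns, deeper layers combine them into more abstract features,
    loosely mirroring the cortical hierarchy the study describes."""

    def __init__(self, n_genres=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),   # early, local features
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # deeper, more abstract
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 32 * 25, n_genres)

    def forward(self, spec):
        # spec: (batch, 1, mel_bins, time_frames), here (N, 1, 128, 100)
        h = self.features(spec)
        return self.classifier(h.flatten(1))

model = AudioNet()
dummy = torch.randn(4, 1, 128, 100)  # fake batch of spectrograms
print(model(dummy).shape)            # torch.Size([4, 10])
```

Trained on labelled audio, the intermediate layers of such a network can then be compared against brain recordings, which is how studies of this kind probe whether the auditory cortex shares the same hierarchical layout.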

