Why we close our eyes

FOX News

Scientists have deduced a possible reason humans tend to close their eyes while kissing. The research, published March 15 in the Journal of Experimental Psychology: Human Perception and Performance, did not study kissing specifically but rather analyzed how visual stimuli can interfere with the other senses. Researchers at Royal Holloway, University of London (RHU) had 16 volunteers perform a letter-search task at varying levels of difficulty while also reacting to the presence or absence of a short vibration to the right or left hand. Participants were less sensitive to the tactile stimulus when the visual search task was more demanding.

Glia put visual map in sync


Watching a fireworks display is a breathtaking experience. As explosions pattern the sky, the visual system must capture information about the time-varying positions, colors, and contrasts of myriad spots of light.

'All I See Is You' is a sensual and visual experience

Los Angeles Times

The premise of "All I See Is You," in which Blake Lively stars as a blind woman who has her sight restored, sounds unbearably sentimental. Thankfully, the film itself is far weirder than that. Director Marc Forster explores questions of identity in relation to sensory experience in this erotic-ish thriller, about a woman whose whole self opens up to the world -- for better or for worse -- after cutting-edge eye surgery restores the sight she lost as a child in a tragic accident. Forster, who wrote the script with Sean Conway, seems fascinated by creating a cinematic experience of blindness. It makes for a unique viewing experience: he weaves a visual spectacle of morphing light and color melding into abstract shapes, a kaleidoscope of fractured, fantastical images coupled with detailed sound design, in an attempt to represent the perspective of Gina (Lively) and her experience of the world.

Decoding Neural Responses in Mouse Visual Cortex through a Deep Neural Network Artificial Intelligence

Finding a code to unravel the population of neural responses that leads to a distinct animal behavior has been a long-standing question in the field of neuroscience. With recent advances in machine learning, hierarchical Deep Neural Networks (DNNs) have been shown to perform well at decoding distinctive features from complex datasets. In this study, we use a DNN to explore computational principles in the mammalian brain, exploiting Neuropixels data from the Allen Brain Institute. We decode neural responses from mouse visual cortex to predict the stimuli presented to the animal, for both natural (bear, trees, cheetah, etc.) and artificial (drifting gratings, oriented bars, etc.) classes. Our results indicate that neurons in mouse visual cortex encode the features of natural and artificial objects in a distinct manner, and that this neural code is consistent across animals. We investigate this by applying transfer learning: we train a DNN on the neural responses of a single animal and test its generalization across multiple animals. Within a single animal, the DNN decodes the neural responses with as much as 100% classification accuracy. Across animals, this accuracy is reduced to 91%. This study demonstrates the potential of DNN models as a computational framework for understanding neural coding principles in the mammalian brain.
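The cross-animal evaluation described above can be illustrated with a minimal sketch: train a decoder on one animal's responses and test it on a second animal that shares the stimulus code but differs in baseline activity. Everything here is hypothetical stand-in data and a simple logistic-regression decoder, not the paper's DNN or the Allen Neuropixels recordings; `make_animal`, `coding_axis`, and the jitter parameter are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_neurons = 50
# Hypothetical stimulus-coding direction shared across animals, mimicking
# the paper's finding that the natural-vs-artificial code is consistent.
coding_axis = rng.standard_normal(n_neurons)

def make_animal(n_trials=400, jitter=0.0):
    """Synthetic 'neural responses': class-1 trials shift along coding_axis;
    `jitter` adds an animal-specific baseline offset (inter-animal variability)."""
    y = rng.integers(0, 2, n_trials)
    offset = jitter * rng.standard_normal(n_neurons)
    X = rng.standard_normal((n_trials, n_neurons)) + offset
    X += np.outer(y, coding_axis) * 0.3
    return X, y

def train_logreg(X, y, lr=0.1, epochs=300):
    """Plain gradient-descent logistic regression, standing in for the DNN."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        grad = p - y
        w -= lr * (X.T @ grad) / len(y)
        b -= lr * grad.mean()
    return w, b

def accuracy(w, b, X, y):
    return float(np.mean(((X @ w + b) > 0) == y))

# Train on "animal A", then evaluate within-animal and across-animal.
Xa, ya = make_animal()
Xb, yb = make_animal(jitter=0.5)
w, b = train_logreg(Xa, ya)
within = accuracy(w, b, Xa, ya)
across = accuracy(w, b, Xb, yb)
print(f"within-animal accuracy: {within:.2f}, cross-animal: {across:.2f}")
```

With a shared coding axis, the decoder transfers across animals, but the animal-specific offset typically costs some accuracy, qualitatively mirroring the 100% within-animal versus 91% cross-animal result reported in the abstract.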

Google Lens visual search rolls out on iOS


After making a slow march across Android devices, Google's AI-powered visual search is coming to iOS. Apple device owners should see a preview of Google Lens appear in the latest version of their Google Photos app over the next week. In case you've forgotten how it works, the idea is that your camera recognizes items in a picture and can take action on them through tie-ins to Google Assistant. Of course, now that you can use the technology, the question is whether you should.