Neurology


Computer Vision: Moving Far Beyond The Visual Cortex

#artificialintelligence

For humans, vision is one of the major senses for interacting with our environment. The lenses in our eyes focus light onto the retina, and the resulting image is transmitted as an electrical signal to the brain, which performs many types of processing. Simple processing can trigger reflexes that help us avoid immediate dangers. More complex processing, performed in the visual cortex and other areas of the brain, enables us to interact more fully with our environment.



Michale Fee receives McKnight Technological Innovations in Neuroscience Award

MIT News

McGovern Institute investigator Michale Fee has been selected to receive a 2018 McKnight Technological Innovations in Neuroscience Award for his research on "new technologies for imaging and analyzing neural state-space trajectories in freely-behaving small animals." "I am delighted to get support from the McKnight Foundation," says Fee, who is also the Glen V. and Phyllis F. Dorflinger Professor in the Department of Brain and Cognitive Sciences at MIT. "We're very excited about this project, which aims to develop technology that will be a great help to the broader neuroscience community." Fee studies the neural mechanisms by which the brain, specifically that of juvenile songbirds, learns complex sequential behaviors. The way that songbirds learn a song through trial and error is analogous to humans learning complex behaviors, such as riding a bicycle. While it would be insightful to link such learning to neural activity, current methods can monitor only a limited number of neurons at a time, a significant limitation since such learning and behavior involve complex interactions across larger circuits.


How neuroscience enables better Artificial Intelligence design

#artificialintelligence

Artificial Intelligence (AI) is evolving at light speed. Artificial systems are capable of outperforming human experts on many levels: crunching data, analysing legal documents, solving Rubik's Cubes, and winning games both ancient and modern. They can produce writing indistinguishable from that of their human counterparts, conduct research, pen pop songs, translate between multiple languages and even create and critique art. And AI-driven tasks like object detection, speech recognition and machine translation are becoming more sophisticated every day. These advances can be credited to many developments, from improved statistical approaches to increased computer processing power.


MIT researchers have taught their AI to see through solid walls

#artificialintelligence

Recently we've seen camera developments from both China and MIT that help us see and take photos around corners, but now you don't need exotic infrared, radar or Wi-Fi to spot people through walls: apparently all you need are some easily detectable wireless signals and a dash of AI. Following on from earlier research that let MIT researchers read people's emotions using just the Wi-Fi signals from their home routers, another team at MIT has developed a system called RF-Pose, where RF stands for Radio Frequency, which uses a neural network to teach RF-equipped devices to sense people's movement and posture behind obstacles. It could be used to help people keep track of elderly relatives in their homes, help gamers turn the house into a giant battleground, and help rescuers locate people. The team trained their AI to recognise human motion in RF by showing it examples of on-camera movement paired with the signals reflected from people's bodies, helping it understand how the reflections correlate with a given posture. From there, the AI could use the wireless signals alone to estimate someone's movements and represent them as stick figures. The scientists mainly see their invention as useful for health care, for the moment anyway, where it could be used to track the progression of diseases like multiple sclerosis and Parkinson's disease.
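The training approach described above is a form of cross-modal supervision: during data collection a camera and the radio sensor observe the same scene, pose labels derived from the camera frames supervise a network that sees only the radio reflections, and at deployment the camera is no longer needed. The sketch below illustrates that idea only; the tiny network, tensor shapes and loss are hypothetical placeholders, not the published RF-Pose implementation.

```python
# Minimal sketch of camera-supervised ("cross-modal") training for RF-based
# pose estimation, as described above. The toy CNN, shapes, and loss are
# illustrative assumptions, not the actual RF-Pose architecture.
import torch
import torch.nn as nn

class RFPoseStudent(nn.Module):
    """Maps an RF reflection heatmap to per-joint confidence maps."""
    def __init__(self, rf_channels: int = 2, num_keypoints: int = 14):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(rf_channels, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(64, num_keypoints, kernel_size=1),  # one map per joint
        )

    def forward(self, rf: torch.Tensor) -> torch.Tensor:
        return self.net(rf)

def train_step(student, optimizer, rf_frames, camera_keypoint_maps):
    """One update: camera-derived keypoint maps supervise the RF branch."""
    pred = student(rf_frames)
    loss = nn.functional.mse_loss(pred, camera_keypoint_maps)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    student = RFPoseStudent()
    opt = torch.optim.Adam(student.parameters(), lr=1e-3)
    # Dummy synchronized batch: RF input plus keypoint maps produced by a
    # camera-based pose estimator acting as the "teacher".
    rf = torch.randn(8, 2, 64, 64)
    cam_labels = torch.rand(8, 14, 64, 64)
    print("loss:", train_step(student, opt, rf, cam_labels))
    # At deployment only the RF input is available; the argmax of each
    # predicted map gives a joint location for drawing the stick figure.
    joint_maps = student(rf)
```

At inference time only the radio branch runs, which is what allows the system to produce stick-figure pose estimates behind obstacles where a camera would see nothing.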


Artificial intelligence: Activating humanity's god mode

#artificialintelligence

The heyday of World of Warcraft saw online players interacting together and forming guilds. Today, artificial intelligence is sophisticated enough that computers are beginning to fill that role instead. Kumi Taguchi sits down with experts at the cutting edge of AI, gaming and the religious experience. In this episode: Artificial intelligence is a rapidly advancing field. Robots can now cook, teach children and help care for patients with dementia, making them seem more and more human-like.


The big problem with big data? Without theory, it's just garbage

#artificialintelligence

Uta Frith doesn't want to meet Donald Trump. "There would be no point in my saying anything to him," she says. "Mostly, when scientists give advice to politicians, politicians listen only to the things they want to hear." Frith, a developmental psychologist who works at University College London, should know. Not only has she been a pioneer in the study of dyslexia and autism -- in the 1960s, she was one of the first researchers in the UK to study Asperger's Syndrome -- but she has also been working to advance the interests of women in science for decades.


As brain extracts meaning from vision, study tracks progression of processing

MIT News

The study, led by researchers at MIT's Picower Institute for Learning and Memory, undermines the classic belief that separate cortical regions play distinct roles. Instead, as animals in the lab refined what they saw down to a specific understanding relevant to behavior, brain cells in each of six cortical regions operated along a continuum between sensory processing and categorization. To be sure, general patterns were evident for each region, but activity associated with categorization was shared surprisingly widely, say the authors of the study published in the Proceedings of the National Academy of Sciences. "The cortex is not modular," says Earl Miller, Picower Professor of Neuroscience in the Department of Brain and Cognitive Sciences at MIT. "Different parts of the cortex emphasize different things and do different types of processing, but it is more of a matter of emphasis. This extends up to higher cognition."


Your first memory probably isn't yours, no matter how real it seems

Popular Science

Think back to your earliest memory. What age were you in it? In a recent survey, 40 percent of people said they remember events from earlier than age two. But here's the problem: most memory researchers argue that it's essentially impossible to remember anything before those terrible twos. Understanding how and why our brains form memories in the first place might convince you that if you're in that 40 percent, perhaps your memory is a fictional one after all.


The questionable ethics of treating autistic children with robots

#artificialintelligence

One day in the spring of 2017, at the department of clinical psychology at Babes-Bolyai University, Romania, a robot stood on a table facing a child. The robot was a half-metre-tall humanoid in brightly coloured plastic, like a toy. Its round eyes lit up as it spoke, its voice childlike. Across the table sat a young boy in a Pokémon T-shirt, playing a game where he had to figure out which object the robot's lit-up eyes were looking at. On the table-top between the pair was a horizontal display showing two digital items: a flower and a tree.