New computational algorithms make it possible to build neural networks with many input nodes and many layers; it is this depth that distinguishes the "deep learning" of such networks from previous work on artificial neural nets.
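To make the notion of "many layers" concrete, here is a minimal sketch of a feed-forward network in pure Python. Everything in it — the layer sizes, weights, and activation choice — is invented for illustration; real deep networks have far more units and learn their weights from data.

```python
# A tiny feed-forward network with two hidden layers, in pure Python.
# All weights and layer sizes below are invented for illustration only.

def relu(xs):
    # Rectified linear activation, applied element-wise.
    return [max(0.0, x) for x in xs]

def dense(inputs, weights, biases):
    # Fully connected layer: weights[j] holds the weights of output unit j.
    return [sum(w * x for w, x in zip(row, inputs)) + b
            for row, b in zip(weights, biases)]

# Layer 1: 2 inputs -> 3 hidden units
W1, b1 = [[0.5, -0.2], [0.1, 0.4], [-0.3, 0.8]], [0.0, 0.1, 0.0]
# Layer 2: 3 hidden units -> 2 hidden units
W2, b2 = [[0.7, -0.5, 0.2], [0.1, 0.9, -0.4]], [0.0, 0.0]
# Output layer: 2 hidden units -> 1 output
W3, b3 = [[1.0, -1.0]], [0.0]

def forward(x):
    # Each layer feeds the next; stacking such layers is the "depth"
    # that the term "deep learning" refers to.
    h1 = relu(dense(x, W1, b1))
    h2 = relu(dense(h1, W2, b2))
    return dense(h2, W3, b3)
```

Training such a network means adjusting the `W` and `b` values to reduce error on data; the new algorithms the excerpt alludes to made that adjustment tractable for networks with many such layers.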
Researchers have built a sensor capable of recording signals from the human brain in record-breaking detail, opening up new possibilities for brain-computer interfaces. A team of engineers and surgeons, led by University of California San Diego professor Shadi Dayeh, used a densely packed grid embedded with thousands of electrocorticography (ECoG) sensors to read activity from the brain's cortex at 100 times the resolution of existing technologies. Early applications could include giving surgeons ultra-clear brain signal information, providing better guidance for removing tumours without damaging healthy tissue, as well as for surgically treating drug-resistant epilepsy. Longer term, the device could be used as a permanent wireless implant to assist people living with paralysis, or those with neurodegenerative diseases such as Parkinson's that can be treated with electrical stimulation. Beyond that, the ECoG technology could be developed for use in the emerging field of brain-computer interfaces, which have a huge range of potential applications – from controlling a computer just by thinking, to streaming music directly to your brain.
The first wireless commands to a computer have been demonstrated in a breakthrough for people with paralysis. The system is able to transmit brain signals at "single-neuron resolution and in full broadband fidelity", say researchers at Brown University in the US. A clinical trial of the BrainGate technology involved a small transmitter connected to the motor cortex of a participant's brain. Trial participants with paralysis used the system to control a tablet computer, the journal IEEE Transactions on Biomedical Engineering reports. The participants achieved typing speeds and point-and-click accuracy similar to those of wired systems.
In the summer of 2009, the Israeli neuroscientist Henry Markram strode onto the TED stage in Oxford, England, and made an immodest proposal: Within a decade, he said, he and his colleagues would build a complete simulation of the human brain inside a supercomputer. They'd already spent years mapping the cells in the neocortex, the supposed seat of thought and perception. "It's a bit like going and cataloging a piece of the rain forest," Markram explained. "How many trees does it have? What shapes are the trees?"
There is a new race in Silicon Valley involving Artificial Intelligence, and no, it's not HealthTech, FinTech or Voice Commerce, nor does it involve Google, Facebook or Microsoft... this race involves the brain, and more specifically brain-computer interfaces. It also involves technology royalty, the US government, billion-dollar defence companies, a big connection to PayPal, and years of medical research to better understand the human brain and implant devices that could make a consumer brain-computer interface a reality. The race is called "Neural implants: merging the human brain with AI". So what exactly are neural implants? Brain implants, often referred to as neural implants, are technological devices that connect directly to a biological subject's brain – usually placed on the surface of the brain or attached to the cortex. A common purpose of modern brain implants, and the focus of much current research, is establishing a biomedical prosthesis that circumvents areas of the brain that have become dysfunctional after a stroke or other head injury.
Somewhat unceremoniously, Facebook this week provided an update on its brain-computer interface project, preliminary plans for which it unveiled at its F8 developer conference in 2017. In a paper published in the journal Nature Communications, a team of scientists at the University of California, San Francisco backed by Facebook Reality Labs -- Facebook's Pittsburgh-based division devoted to augmented reality and virtual reality R&D -- described a prototypical system capable of reading and decoding study subjects' brain activity while they speak. It's impressive no matter how you slice it: The researchers managed to make out full, spoken words and phrases in real time. Study participants (who were prepping for epilepsy surgery) had a patch of electrodes placed on the surface of their brains, which employed a technique called electrocorticography (ECoG) -- the direct recording of electrical potentials associated with activity from the cerebral cortex -- to derive rich insights. A set of machine learning algorithms equipped with phonological speech models learned to decode specific speech sounds from the data and to distinguish between questions and responses.
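The classification step at the end of that pipeline can be caricatured with a toy sketch: treat each utterance as a feature vector of per-electrode signal power and assign it the label of the nearest class centroid. This is only an illustration of the decode-and-distinguish idea, not the study's actual phonological models; the feature dimensions, labels, and data below are all invented.

```python
# Toy "decoder": nearest-centroid classification over invented
# per-electrode power features. Illustrative only -- not the
# machine learning models used in the Nature Communications study.

def centroid(vectors):
    # Mean feature vector of a list of equal-length vectors.
    n = len(vectors)
    return [sum(column) / n for column in zip(*vectors)]

def squared_distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def train(labelled_examples):
    # labelled_examples: {label: [feature vector, ...]} -> {label: centroid}
    return {label: centroid(vecs) for label, vecs in labelled_examples.items()}

def classify(model, features):
    # Predict the label whose centroid is closest to the new feature vector.
    return min(model, key=lambda label: squared_distance(model[label], features))

# Invented two-electrode feature vectors for two utterance types.
examples = {
    "question": [[1.0, 0.0], [0.9, 0.1]],
    "response": [[0.0, 1.0], [0.1, 0.9]],
}
model = train(examples)
```

A new recording close to the "question" centroid, such as `[0.8, 0.2]`, would be labelled a question. The real system replaces these hand-made vectors with ECoG recordings and the nearest-centroid rule with learned models informed by phonological speech structure, but the shape of the task — map neural features to discrete linguistic labels — is the same.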
Your brain is one enigmatic hunk of meat--a wildly complex web of neurons numbering in the tens of billions. But years ago, when you were in the womb, it began as little more than a scattering of undifferentiated stem cells. A series of genetic signals transformed those blank slates into the wrinkly, three-pound mass between your ears. Scientists think the way your brain looks and functions can be traced back to those first molecular marching orders--but precisely when and where these genetic signals occur has been difficult to pin down.
Male territoriality is a pretty well-defined scientific concept. Some animals mark their domain with rocks or urine, others attack intruders (and we've all seen guys who pick fights at the bar). Researchers at Stanford University Medical Center have taken a closer look at the roots of this rage in the mouse brain, and in a study published today in Neuron, they pinpoint the brain cells that give rise to male territorial aggression. They also found that in some cases, the mice know when fighting would be a faux pas. The neurons in question are clustered in the ventromedial hypothalamus (VMH), located deep in the center of the brain in a region that plays a role in many hormonally-controlled activities--things like fear, eating and sexual activity.
Our brains make memories of many kinds -- how to walk and jump, facts and figures, our fears, the events in our lives. Nanthia Suthana of the UCLA Brain Research Institute studies the way we remember events. As depicted here, she explains what scientists believe happens when a person remembers her 21st birthday party. Images, sounds, smells and other stimuli from the party are translated into electrical signals and channeled to different parts of the cerebral cortex. The cerebral cortex then channels the signals to another part of the brain that will form, or encode, the memory.