And Now, a Bicycle Built for None

#artificialintelligence

"That is where we see the big promise," said Mike Davies, who oversees Intel's efforts to build neuromorphic chips. Over the past decade, the development of artificial intelligence has accelerated thanks to what are called neural networks: complex mathematical systems that can learn tasks by analyzing vast amounts of data. By metabolizing thousands of cat photos, for instance, a neural network can learn to recognize a cat. This is the technology that recognizes faces in the photos you post to Facebook, identifies the commands you bark into your smartphone and translates between languages on internet services like Microsoft Skype. It is also hastening the advance of autonomous robots, including self-driving cars.


Neuromorphic Chipsets - Industry Adoption Analysis

#artificialintelligence

Von Neumann Architecture vs. Neuromorphic Architecture

Neuromorphic architectures address challenges such as high power consumption, low speed, and other efficiency-related bottlenecks prevalent in the traditional von Neumann architecture.

Architecture bottleneck: The von Neumann architecture separates the CPU from memory; neuromorphic architectures integrate processing and storage, getting rid of the bus bottleneck connecting the two.

Encoding scheme and signals: Unlike the von Neumann architecture, with its sudden highs and lows in the form of binary encoding, neuromorphic chips offer a continuous analog transition in the form of spiking signals.

Devices and components: Von Neumann machines are built from CPUs, memory, logic gates, and so on; neuromorphic chips are built from artificial neurons and synapses, which are more complex devices than logic gates.

Neuromorphic Chipsets vs. GPUs

Basic operation: Neuromorphic chips emulate the biological behavior of neurons directly on a chip; GPUs use parallel processing to perform mathematical operations.

Parallelism: Neurons and synapses give neuromorphic chips inherent parallelism; GPUs require purpose-built parallel-processing architectures to handle multiple tasks simultaneously.

Data processing: High for both.

Power: Neuromorphic chips are low-power; GPUs are power-intensive.

Accuracy: Low for neuromorphic chips; high for GPUs.

Industry adoption: Neuromorphic chips are still in the experimental stage; GPUs are far more accessible.

Software: New tools and methodologies need to be developed for programming neuromorphic hardware; GPUs are easier to program.

Memory: Neuromorphic chips integrate memory and neural processing; GPUs use external memory.

Limitations: Neuromorphic chips are not suitable for precise calculations, pose programming-related challenges, and are difficult to build because of the complexity of their interconnections; GPUs are thread-limited and suboptimal for massively parallel structures.

Neuromorphic chipsets are at an early stage of development and would take approximately 20 years to reach the same level as GPUs. The asynchronous operation of neuromorphic chips makes them more efficient than other processing units.
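The "spiking signals" contrast above is easier to see in code. Below is a minimal sketch of a leaky integrate-and-fire neuron, the basic unit neuromorphic chips implement in silicon; all parameter values are arbitrary illustrations, not those of any particular chipset.

    import numpy as np

    # Leaky integrate-and-fire (LIF) neuron: the membrane potential leaks
    # toward zero, integrates incoming current, and emits a discrete spike
    # (an "event") whenever it crosses a threshold. Illustrative values only.
    dt = 1.0          # timestep (ms)
    tau = 20.0        # membrane time constant (ms)
    v_thresh = 1.0    # spike threshold
    v = 0.0           # membrane potential
    spikes = []

    rng = np.random.default_rng(1)
    for t in range(200):
        i_in = rng.uniform(0.0, 0.12)      # noisy input current
        v += dt / tau * (-v) + i_in        # leak toward zero, integrate input
        if v >= v_thresh:
            spikes.append(t)               # emit a spike ...
            v = 0.0                        # ... and reset

    print(len(spikes), "spikes at times (ms):", spikes)

Because the neuron communicates only when it actually spikes, the rest of the system can sit idle between events; that event-driven, asynchronous behavior is what the comparison above credits for the efficiency of neuromorphic chips.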


Intel's neuro guru slams deep learning: 'it's not actually learning'

ZDNet

"Backpropogation doesn't correlate to the brain," insists Mike Davies, head of Intel's neuromorphic computing unit, dismissing one of the key tools of the species of A.I. Davies made the comment during a talk on Thursday at the International Solid State Circuits Conference in San Francisco, a prestigious annual gathering of semiconductor designers. Davies was returning fire after Facebook's Yann LeCun, a leading apostle of deep learning, earlier in the week dismissed Davies's own technology during LeCun's opening keynote for the conference. "The brain is the one example we have of truly intelligent computation," observed Davies. In contrast, so-called back-prop, invented in the 1980s, is a mathematical technique used to optimize the response of artificial neurons in a deep learning computer program. Although deep learning has proven "very effective," Davies told a ballroom of attendees, "there is no natural example of back-prop," he said, so it doesn't correspond to what one would consider real learning.


What is neuromorphic computing?

#artificialintelligence

This article is part of Demystifying AI, a series of posts that (try to) disambiguate the jargon and myths surrounding AI. In July, a group of artificial intelligence researchers showcased a self-driving bicycle that could navigate around obstacles, follow a person, and respond to voice commands. While the self-driving bike itself was of little use, the AI technology behind it was remarkable. Powering the bicycle was a neuromorphic chip, a special kind of AI computer. Neuromorphic computing is not new.