Neuromorphic computing, or neuromorphic engineering, has been described as the use of very-large-scale integration systems containing numerous analog circuits to replicate neuro-biological behaviors found in the human nervous system. The neuromorphic computing platform consists of two vital systems based on a custom hardware architecture. Such systems are designed to program neural microcircuits by applying brain-like thought processes to cognitive computing and machine learning. This approach enables a machine to learn, adapt, and function the way a human brain does rather than like a conventional computer. To perform such complex tasks, the computing platform requires state-of-the-art circuit technologies and electronic components, which allow the platform to incorporate new data and knowledge gained from other areas of neuroscience research.
I saw a video article on Neuromorphic Computing the other day - something I had not really heard much about, though it ties in heavily to Artificial Intelligence, which I, of course, do know about. Wow... the possibilities are now endless. Here is what Techopedia says about Neuromorphic Computing: Neuromorphic computing utilizes an engineering approach or method based on the activity of the biological brain. This type of approach can make technologies more versatile and adaptable, and produce richer results than other types of traditional architectures, for instance, the von Neumann architecture that is so prevalent in traditional hardware design. Neuromorphic computing is also known as neuromorphic engineering.
As the long-predicted end of Moore's Law seems ever more imminent, researchers around the globe are seriously evaluating a profoundly different approach to large-scale computing inspired by biological principles. In the traditional von Neumann architecture, a powerful logic core (or several in parallel) operates sequentially on data fetched from memory. In contrast, "neuromorphic" computing distributes both computation and memory among an enormous number of relatively primitive "neurons," each communicating with hundreds or thousands of other neurons through "synapses." Ongoing projects are exploring this architecture at a vastly larger scale than ever before, rivaling mammalian nervous systems, and developing programming environments that take advantage of it. Still, detailed implementation choices, such as whether to use analog circuits, differ between the projects, and it may be several years before their relative merits can be assessed.
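The contrast above can be sketched in plain Python. This is an illustrative toy, not any project's actual design: the `Neuron` class, the threshold, leak, and weight values, and the random wiring are all assumptions chosen for demonstration. The point is that each unit keeps its own state (its "memory") and communicates only through discrete spike events, rather than a central core fetching data from a separate memory.

```python
import random

class Neuron:
    """A toy leaky integrate-and-fire unit holding its own local state."""
    def __init__(self, threshold=1.0, leak=0.9):
        self.potential = 0.0       # local "memory", co-located with computation
        self.threshold = threshold
        self.leak = leak
        self.synapses = []         # (target_neuron, weight) pairs

    def receive(self, weight):
        """Accumulate an incoming spike's weight."""
        self.potential += weight

    def step(self):
        """Leak, then fire a spike to downstream neurons if over threshold."""
        self.potential *= self.leak
        if self.potential >= self.threshold:
            self.potential = 0.0   # reset after firing
            for target, w in self.synapses:
                target.receive(w)
            return True
        return False

# Wire a small random network: there is no central logic core; every
# neuron computes and remembers independently.
random.seed(0)
neurons = [Neuron() for _ in range(100)]
for n in neurons:
    for _ in range(5):
        n.synapses.append((random.choice(neurons), 0.5))

# Inject input into one neuron and watch activity propagate (or decay)
# over a few discrete time steps.
neurons[0].receive(2.0)
spikes_per_step = [sum(n.step() for n in neurons) for _ in range(10)]
```

Real neuromorphic systems run millions of such units in parallel, often asynchronously and in analog hardware, but the organizing idea - distributed state, event-driven communication - is the same.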
Neuromorphic chips are expected to be the predominant computing architecture for new, advanced forms of artificial-intelligence deployments by 2025, according to technology research firm Gartner Inc. By that year, Gartner predicts, the technology is expected to displace graphics processing units, one of the main computer chips used for AI systems, especially neural networks. Neural networks are used in speech recognition and understanding, as well as computer vision. With neuromorphic computing, it is possible to train machine-learning models using a fraction of the data it takes to train them on traditional computing hardware. That means the models learn similarly to the way human babies learn, by seeing an image or toy once and being able to recognize it forever, said Mike Davies, director of Intel's Neuromorphic Computing Lab.
At first glance, the new breed of neuromorphic chips has several things in common with the similarly cutting-edge field of AI accelerators. Both are designed to process artificial neural networks, both offer performance improvements over CPUs, and both claim to be more power efficient. That's where the similarity ends, though: neuromorphic chips are designed only for a special class of neural networks called spiking networks, and their structure is fundamentally different from anything seen in traditional computing (nothing so conventional as multiply-accumulate units). It is perhaps too soon to say what the market for these devices will look like, as new applications and technologies continue to emerge.
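The distinction between spiking computation and conventional multiply-accumulate hardware can be illustrated with a small sketch. This is a simplified rate-coding toy, not how any particular chip works: the function names, the input values, and the use of Bernoulli spike trains are all assumptions for demonstration. It shows that a weighted sum can be approximated by adding a weight only when a spike event arrives, rather than multiplying every input by every weight on every cycle.

```python
import random

def rate_encode(value, n_steps, rng):
    """Encode a value in [0, 1] as a binary spike train (rate coding)."""
    return [1 if rng.random() < value else 0 for _ in range(n_steps)]

def spiking_dot(inputs, weights, n_steps=1000):
    """Approximate a dot product with spike events: a weight is added only
    when an input spike arrives, instead of a dense multiply-accumulate
    over every input on every cycle."""
    rng = random.Random(0)  # fixed seed so the sketch is reproducible
    trains = [rate_encode(x, n_steps, rng) for x in inputs]
    total = 0.0
    for t in range(n_steps):
        for train, w in zip(trains, weights):
            if train[t]:        # event-driven: work happens only on spikes
                total += w
    return total / n_steps      # spike-rate average approximates the dot product

inputs  = [0.2, 0.8, 0.5]
weights = [1.0, -0.5, 2.0]
approx = spiking_dot(inputs, weights)
exact  = sum(x * w for x, w in zip(inputs, weights))  # = 0.8
```

Because work is proportional to spike activity rather than network size, sparse activity translates directly into energy savings, which is one reason spiking hardware claims better power efficiency than dense multiply-accumulate designs.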