Intel Scales Neuromorphic Research System to 100 Million Neurons (Intel Newsroom)


What's New: Today, Intel announced the readiness of Pohoiki Springs, its latest and most powerful neuromorphic research system, providing the computational capacity of 100 million neurons. The cloud-based system will be made available to members of the Intel Neuromorphic Research Community (INRC), extending their neuromorphic work to solve larger, more complex problems. The system enables research partners to explore ways to accelerate workloads that run slowly today on conventional architectures, including high-performance computing (HPC) systems.

What It Is: Pohoiki Springs is a data-center rack-mounted system and Intel's largest neuromorphic computing system developed to date. Its Loihi processors take inspiration from the human brain.

Neuromorphic Computing Market Technology and Rising Demand For Artificial Intelligence


Neuromorphic computing, or neuromorphic engineering, has been described as the use of very-large-scale integration systems containing numerous analog circuits to replicate the neuro-biological behaviors found in the human nervous system. The neuromorphic computing market platform consists of two vital systems based on custom hardware architecture. Such systems are designed to program neural microcircuits by applying brain-like thought processes to cognitive computing and machine learning. This approach enables a machine to learn, adapt, and function the way a human brain does rather than the way a conventional computer does. In addition, to perform such complex tasks, the computing platform requires state-of-the-art circuit technologies and electronic components, which allow the platform to receive new data or knowledge gained from various other sources of neuroscience research, e.g.
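The basic unit these analog circuits replicate is the spiking neuron: a membrane potential that leaks over time, integrates incoming current, and fires when it crosses a threshold. As a rough illustration (a minimal software sketch, not the behavior of any specific chip; the parameter values here are arbitrary assumptions), a leaky integrate-and-fire neuron can be simulated in a few lines:

```python
# Minimal leaky integrate-and-fire (LIF) neuron simulation.
# Illustrative sketch only; parameters are arbitrary, not drawn from
# Loihi or any other neuromorphic hardware.

def simulate_lif(input_current, v_rest=0.0, v_thresh=1.0, leak=0.9):
    """Simulate one LIF neuron over a sequence of input-current samples.

    Returns the list of time steps at which the neuron spiked.
    """
    v = v_rest
    spikes = []
    for t, i in enumerate(input_current):
        v = leak * v + i      # potential decays toward rest, then integrates input
        if v >= v_thresh:     # threshold crossing emits a spike
            spikes.append(t)
            v = v_rest        # reset the membrane potential after spiking
    return spikes

# A constant drive produces a regular spike train:
print(simulate_lif([0.3] * 20))  # → [3, 7, 11, 15, 19]
```

Neuromorphic chips implement many such neurons (and the synapses connecting them) directly in silicon, which is why they compute with sparse, event-driven spikes rather than the clocked dense arithmetic of a conventional processor.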

Neuromorphic Computing


I saw a video article on Neuromorphic Computing the other day - something I had not really heard much about, though it ties in heavily to Artificial Intelligence, which I, of course, do know about. Wow... the possibilities are now endless. This is what Techopedia says about Neuromorphic Computing: neuromorphic computing utilizes an engineering approach or method based on the activity of the biological brain. This type of approach can make technologies more versatile and adaptable, and promote more vibrant results than other types of traditional architectures, for instance, the von Neumann architecture that is so useful in traditional hardware design. Neuromorphic computing is also known as neuromorphic engineering.

Better AI research depends on benchmarks, Intel guru says


The head of Intel's Neuromorphic Computing Lab wants a standard set of academic and industry benchmarks to track research progress for cloud and AI work. He offered a few starting points toward such benchmarks in a recent scholarly article in Nature Machine Intelligence. It's a big ask, but not all that different from what goes on in most of science: Why go to the moon? Why even have cars? Aren't horses good enough? For Pete's sake, why invent the wheel?

Intel to Release Neuromorphic-Computing System


Neuromorphic chips are expected to be the predominant computing architecture for new, advanced forms of artificial-intelligence deployments by 2025, according to technology research firm Gartner Inc. By that year, Gartner predicts, the technology will displace graphics processing units, one of the main types of computer chips used for AI systems, especially neural networks. Neural networks are used in speech recognition and understanding, as well as computer vision. With neuromorphic computing, it is possible to train machine-learning models using a fraction of the data it takes to train them on traditional computing hardware. That means the models learn similarly to the way human babies learn, by seeing an image or toy once and being able to recognize it forever, said Mike Davies, director of Intel's Neuromorphic Computing Lab.