If you are looking for an answer to the question "What is artificial intelligence?" and you only have a minute, here is the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
Neuromorphic computing is an in-memory approach that promises orders-of-magnitude reductions in power consumption compared with conventional transistor-based architectures, and it is well suited to complex data classification and processing.
Can AI function like a human brain? Future technologies such as autonomous vehicles and robots will need to access and use enormous amounts of data and information in real time. Today this is done, to a limited extent, by machine learning and AI that depend on supercomputing power. As we unearth its benefits, the success of the machine-learning and AI quest seems to depend to a great extent on the success of neuromorphic computing; armed with it, researchers are ready to show the world that this dream can change the world for the better.
Over the past few decades, computers have seen dramatic progress in processing power; however, even the most advanced computers are relatively rudimentary in comparison with the complexities and capabilities of the human brain. Researchers at the U.S. Army Combat Capabilities Development Command's Army Research Laboratory say this may be changing as they endeavor to design computers inspired by the human brain's neural structure. As part of a collaboration with Lehigh University, Army researchers have identified a design strategy for the development of neuromorphic materials. "Neuromorphic materials is a name given to the material categories or combination of materials that provide both computing and memory capabilities in devices," said Dr. Sina Najmaei, a research scientist and electrical engineer with the laboratory. Najmaei and his colleagues published a paper, Dynamically reconfigurable electronic and phononic properties in intercalated Hafnium Disulfide (HfS2), in the May 2020 issue of Materials Today.
At first glance, the new breed of neuromorphic chips has several things in common with the similarly cutting-edge field of AI accelerators. Both are designed to process artificial neural networks, both offer performance improvements over CPUs, and both claim to be more power efficient. That is where the similarity ends, though: neuromorphic chips are designed only for a special class of neural networks called spiking networks, and their structure is fundamentally different from anything seen in traditional computing (nothing so conventional as multiply-accumulate units). It is perhaps too soon to say what the market for these devices will look like, as new applications and technologies continue to emerge.
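Since the paragraph above contrasts spiking networks with conventional multiply-accumulate hardware, a toy simulation may help make the difference concrete: a leaky integrate-and-fire (LIF) neuron, the standard idealized unit of a spiking network. This is a minimal sketch with arbitrary parameter values, not a model of any particular neuromorphic chip.

```python
# Minimal leaky integrate-and-fire (LIF) neuron: the basic unit of a
# spiking network. Unlike a conventional multiply-accumulate unit, it
# integrates its input over time and emits a discrete spike when its
# membrane potential crosses a threshold. All values are illustrative.

def simulate_lif(input_current, threshold=1.0, leak=0.9, reset=0.0):
    """Return the list of time steps at which the neuron spikes."""
    potential = 0.0
    spikes = []
    for t, current in enumerate(input_current):
        potential = potential * leak + current  # leaky integration
        if potential >= threshold:              # threshold crossing
            spikes.append(t)
            potential = reset                   # reset after spiking
    return spikes

# A constant drive makes the neuron spike periodically.
print(simulate_lif([0.3] * 20))  # → [3, 7, 11, 15, 19]
```

Information is carried in the timing of these discrete spikes rather than in continuous activation values, which is why such chips can stay idle (and save power) whenever no spikes are arriving.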
Everyone in the field of artificial intelligence knows what neural networks are, and most practitioners know the enormous processing power and energy consumption needed to train almost any noteworthy neural network. For the field to develop further, a new type of hardware is needed. Some experts believe the quantum computer is that hardware, but even though it holds great promise, quantum computing is a technology that may take many decades to develop.
Intel's fifth-generation Loihi chip uses neuromorphic computing to learn faster on less training data than traditional artificial-intelligence techniques -- including how to smell as a human does and draw accurate conclusions from a tiny dataset of essentially just one sample. "That's really one of the main things we're trying to understand and map into silicon … the brain's ability to learn with single examples," Mike Davies, the director of Intel's Neuromorphic Computing Lab, told me recently on The AI Show podcast. "So with just showing one clean presentation of an odor, we can store that in this high dimensional representation in the chip, and then it allows it to then recognize a variety of noisy, corrupted, occluded odors like you would be faced with in the real world." Neuromorphic computing has been around since the 1980s and is an attempt to use technology to mimic biological systems. Intel believes it is "the next generation of AI" and has designed its Loihi chip with neural units that approximate some functions of a human brain.
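Davies' description of storing one clean presentation as a high-dimensional representation can be illustrated with a deliberately simplified, non-spiking sketch: store one binary prototype per odor, then classify noisy inputs by whichever stored prototype they overlap most. The dimension, patterns, and matching rule here are assumptions chosen for illustration; Loihi's actual spiking encoding works very differently.

```python
import random

# Toy sketch of one-shot recognition in a high-dimensional code:
# "learn" each odor from a single clean binary pattern, then classify
# noisy, partly corrupted inputs by nearest stored prototype.
# Illustrative only; not Loihi's actual encoding.

DIM = 1000
random.seed(0)

def random_pattern():
    return [random.randint(0, 1) for _ in range(DIM)]

def corrupt(pattern, flip_fraction=0.2):
    """Flip a fraction of bits to simulate a noisy, occluded input."""
    out = pattern[:]
    for i in random.sample(range(DIM), int(DIM * flip_fraction)):
        out[i] ^= 1
    return out

def overlap(a, b):
    return sum(x == y for x, y in zip(a, b))

# One-shot learning: a single clean example per hypothetical odor.
prototypes = {name: random_pattern() for name in ["ammonia", "acetone", "methane"]}

def classify(observation):
    return max(prototypes, key=lambda name: overlap(prototypes[name], observation))

# Even with 30% of the bits flipped, the noisy input still matches
# its stored prototype far better than any other.
print(classify(corrupt(prototypes["acetone"], flip_fraction=0.3)))
```

In a high-dimensional space, random prototypes agree on only about half their bits, so even a heavily corrupted input remains much closer to its own prototype than to any other; that margin is what makes single-example storage robust to noise.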
Nature Machine Intelligence published a joint paper from researchers at Intel Labs and Cornell University demonstrating the ability of Intel's neuromorphic test chip, Loihi, to learn and recognize 10 hazardous chemicals, even in the presence of significant noise and occlusion. The work demonstrates how neuromorphic computing could be used to detect smells that are precursors to explosives, narcotics and more. Loihi learned each new odor from a single example without disrupting the previously learned smells, requiring up to 3000x fewer training samples per class compared to a deep learning solution and demonstrating superior recognition accuracy. The research shows how the self-learning, low-power, and "brain-like" properties of neuromorphic chips – combined with algorithms derived from neuroscience – could be the answer to creating "electronic nose" systems that recognize odors under real-world conditions more effectively than conventional solutions. "We are developing neural algorithms on Loihi that mimic what happens in your brain when you smell something," said Nabil Imam, senior research scientist in Intel's Neuromorphic Computing Lab.
Intel has scaled up its neuromorphic computing system by integrating 768 of its Loihi chips into a 5 rack-unit system called Pohoiki Springs. This cloud-based system will be made available to Intel's Neuromorphic Research Community (INRC) to enable research and development of larger and more complex neuromorphic algorithms. Pohoiki Springs contains the equivalent of 100 million neurons, about the same number as in the brain of a small mammal such as a mole rat or a hamster. Intel debuted its Loihi neuromorphic chip for research applications in 2017. It mimics the architecture of the brain, using electrical pulses known as spikes, whose timing modulates the strength of the connections between neurons.
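The last sentence above notes that spike timing modulates connection strength; the textbook rule with exactly that behavior is spike-timing-dependent plasticity (STDP). A minimal sketch of the idea follows, with illustrative constants not taken from Loihi.

```python
import math

# Minimal sketch of spike-timing-dependent plasticity (STDP): a synapse
# strengthens when the presynaptic spike arrives just before the
# postsynaptic one (a causal pairing) and weakens when it arrives just
# after. Constants are illustrative, not taken from any real chip.

def stdp_delta(pre_time, post_time, a_plus=0.1, a_minus=0.12, tau=20.0):
    """Weight change for one pre/post spike pair (times in ms)."""
    dt = post_time - pre_time
    if dt > 0:    # pre before post: potentiation
        return a_plus * math.exp(-dt / tau)
    elif dt < 0:  # post before pre: depression
        return -a_minus * math.exp(dt / tau)
    return 0.0

# Pre fires 5 ms before post: the connection strengthens.
print(stdp_delta(pre_time=10.0, post_time=15.0) > 0)   # True
# Pre fires 5 ms after post: the connection weakens.
print(stdp_delta(pre_time=15.0, post_time=10.0) < 0)   # True
```

Because the update depends only on locally observable spike times, rules of this family can be evaluated at each synapse in parallel, which is one reason timing-based learning maps well onto neuromorphic hardware.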
Scientists can use nanomaterials to mimic the human brain's structure. Despite its name, artificial intelligence isn't all that smart -- at least when compared with human brains. A.I.s are excellent number crunchers and pattern finders, but when it comes to actual human-level cognition and problem-solving, they still have a ways to go. That distance could be shortening quickly, thanks to the emergence of a next generation of A.I. called neuromorphic computing. Instead of teaching A.I.s to learn through rigid logic gates, processors, strict rules and datasets, neuromorphic computing takes a more biological approach and designs computing systems that mimic a human brain's architecture of neurons and synapses.
Neuromorphic chips are expected to be the predominant computing architecture for new, advanced forms of artificial-intelligence deployments by 2025, according to technology research firm Gartner Inc. By that year, Gartner predicts, the technology will displace graphics processing units, one of the main computer chips used for AI systems, especially neural networks. Neural networks are used in speech recognition and understanding, as well as computer vision. With neuromorphic computing, it is possible to train machine-learning models using a fraction of the data it takes to train them on traditional computing hardware. That means the models learn in a way similar to how human babies learn, seeing an image or toy once and being able to recognize it forever, said Mike Davies, director of Intel's Neuromorphic Computing Lab.