COVER A conceptual illustration of an artificial neuron evokes a technology that is transforming many fields of science: artificial intelligence (AI). One common form of AI is a neural network, which "learns" as connections between simulated neurons change in response to inputs. Such systems can find meaningful patterns in vast data sets, ranging from genomics to astronomy, and are even beginning to design experiments.
Manuel Le Gallo's research will inspire a new generation of extremely dense neuromorphic computing systems. Inspired by the way the human brain functions, a team of scientists at IBM Research in Zurich has imitated the way neurons spike, for example when we touch a hot plate. These so-called artificial neurons can be used to detect patterns and discover correlations in Big Data, with power budgets and at densities comparable to those seen in biology, something scientists have strived to accomplish for decades. They can also learn unsupervised, at high speed, using very little energy. The paper, entitled "Stochastic phase-change neurons," which appeared today on the cover of Nature Nanotechnology, outlines the research and its findings.