Spintronic memory cells for neural networks

In recent years, researchers have proposed a wide variety of hardware implementations for feed-forward artificial neural networks. These implementations include three key components: a dot-product engine that computes convolution and fully-connected layer operations, memory elements that store intermediate inter- and intra-layer results, and additional components that compute non-linear activation functions. Dot-product engines, which are essentially high-efficiency accelerators, have already been implemented in hardware in many different ways. In a study published last year, researchers at the University of Notre Dame in Indiana used dot-product circuits to design a cellular neural network (CeNN)-based accelerator for convolutional neural networks (CNNs). The same team, in collaboration with researchers at the University of Minnesota, has now developed a highly energy-efficient CeNN cell based on spintronic (i.e., spin-electronic) elements.
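To make the three-component architecture described above concrete, here is a minimal software sketch of how a single layer flows through a dot-product engine, intermediate memory, and a non-linear activation stage. This is only a conceptual analogue written in Python/NumPy; the function names, the ReLU activation, and the random example data are illustrative assumptions, not the CeNN or spintronic design from the papers discussed here.

```python
import numpy as np

# Conceptual software analogue of the three hardware components:
#  1. a dot-product engine (convolution / fully-connected operations),
#  2. memory elements holding intermediate results,
#  3. a non-linear activation stage.
# Illustrative sketch only -- not the actual accelerator design.

def dot_product_engine(weights, inputs):
    """Fully-connected layer operation: one weighted sum per output neuron."""
    return weights @ inputs  # the role played by the analog dot-product circuit

def activation(x):
    """Non-linear activation stage (ReLU chosen here purely for simplicity)."""
    return np.maximum(0.0, x)

rng = np.random.default_rng(0)
inputs = rng.random(8)          # example layer input
weights = rng.random((4, 8))    # 4 output neurons, 8 inputs each

intermediate = dot_product_engine(weights, inputs)  # stored in memory elements
outputs = activation(intermediate)                  # passed on to the next layer
print(outputs)
```

In a hardware accelerator, each of these steps maps to a physical block: the matrix-vector product is performed by the dot-product circuits, the intermediate vector is held in memory cells (spintronic elements in the work described above), and the non-linearity is applied by dedicated activation circuitry.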
