More on 3rd Generation Spiking Neural Nets


Recently we wrote about the development of AI and neural nets beyond the second-generation Convolutional and Recurrent Neural Nets (CNNs / RNNs) that dominate the current conversation about deep learning. The original charge from DARPA's SyNAPSE program has spread the work among many national laboratories, including Sandia, Oak Ridge, and Lawrence Livermore. "Neuromorphic computing is still in its beginning stages," says Dr. Catherine Schuman, a researcher working on such architectures at Oak Ridge National Laboratory. State of the art on the hardware side still appears to belong to IBM, which recently delivered a TrueNorth-based supercomputing platform to Lawrence Livermore National Laboratory with the equivalent of 16 million neurons and 4 billion synapses.
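To make the "spiking" idea concrete: unlike the continuous activations in CNNs and RNNs, a spiking neuron integrates input over time and fires a discrete spike only when its membrane potential crosses a threshold. Below is a minimal sketch of a leaky integrate-and-fire (LIF) neuron, the textbook building block of third-generation nets. All parameter values and the function name are illustrative assumptions, not taken from TrueNorth or any specific chip.

```python
# Minimal leaky integrate-and-fire (LIF) neuron sketch.
# Parameters (tau, thresholds, drive) are arbitrary illustrative choices.

def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_threshold=1.0, v_reset=0.0):
    """Simulate one LIF neuron; return the membrane trace and spike times."""
    v = v_rest
    voltages, spikes = [], []
    for t, i_in in enumerate(input_current):
        # Leaky integration: potential decays toward rest while
        # accumulating the input current (forward-Euler step).
        v += dt * (-(v - v_rest) + i_in) / tau
        if v >= v_threshold:      # threshold crossing -> emit a spike
            spikes.append(t)
            v = v_reset           # reset the potential after spiking
        voltages.append(v)
    return voltages, spikes

# Constant drive strong enough to make the neuron spike periodically.
volts, spike_times = simulate_lif([1.5] * 200)
print(f"{len(spike_times)} spikes, first at step {spike_times[0]}")
```

Because the input drive is constant, the neuron settles into a regular firing rate; in a full spiking network, information is carried in exactly this timing and rate of spikes rather than in real-valued activations.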
