Accurate and Energy-Efficient Classification with Spiking Random Neural Network: Corrected and Expanded Version

Khaled F. Hussain, Mohamed Yousef Bassyouni, Erol Gelenbe

arXiv.org (Machine Learning)

Despite being first proposed about 60 years ago [1], only in the past few years have artificial neural networks (ANNs) become the de facto standard machine learning model [2], achieving state-of-the-art results on a wide range of problems: image classification [3]-[5], object detection [6], [7], semantic segmentation [8], [9], face recognition [10], [11], and text recognition [12], [13], as well as speech recognition [14]-[16] and natural language processing tasks such as machine translation [17], [18], language modeling [19], and question answering [20]. This has led to huge industry-wide adoption by leading technology companies such as Google, Facebook, Microsoft, IBM, Yahoo!, Twitter, and Adobe, and by a quickly growing number of startups. A prominent reason for this recent revival is that reaching such performance requires very large labeled datasets and vast computational power, which came into the hands of individual researchers only recently in the form of GPUs [21]; this kick-started the deep learning revolution in 2012 [3]. Since then, the computation and power consumption demanded by such applications have grown steadily.

Although initially bio-inspired, ANNs differ significantly from actual biological neurons in how neurons perform computation, in their structure (connection patterns and topologies), in learning (how neurons adapt to new observations), and in communication (how inter-neuron data is encoded and passed). Communication is one of the main differences. Biological neurons communicate and encode data using asynchronous trains of spikes in an event-based, data-driven manner that adapts locally to the external stimulation pattern (though the specific encoding mechanism neurons use is not fully understood). ANNs, by contrast, communicate through dense, continuous-valued activations, which means that all neurons are active at the same time, consuming considerable computation and energy.

Spiking neural networks leverage the ability of biological neurons to communicate asynchronously through trains of spikes. They thus incorporate the concept of time: instead of all neurons firing simultaneously, as in ANNs, a spiking neuron fires only when its intrinsic potential (i.e., its membrane potential) crosses a firing threshold.
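To make the event-driven contrast concrete, the sketch below simulates a single leaky integrate-and-fire (LIF) neuron, a common simplified spiking model. This is an illustrative assumption on our part, not the Random Neural Network model the paper itself uses; the function name, parameter names (`tau_m`, `v_thresh`, `v_reset`), and input trace are all hypothetical choices for demonstration.

```python
import numpy as np

# Minimal sketch of a leaky integrate-and-fire (LIF) neuron. Illustrative
# only: this is a generic spiking model, NOT the paper's Random Neural
# Network; all parameter names and values here are assumptions.

def simulate_lif(input_current, dt=1.0, tau_m=20.0, v_rest=0.0,
                 v_thresh=1.0, v_reset=0.0):
    """Simulate one LIF neuron; return the membrane trace and spike times."""
    v = v_rest
    v_trace, spike_times = [], []
    for t, i_t in enumerate(input_current):
        # Leaky integration: the potential decays toward rest while being
        # driven by the input current.
        v += (-(v - v_rest) + i_t) * (dt / tau_m)
        if v >= v_thresh:
            # Event: a spike is emitted only at a threshold crossing...
            spike_times.append(t * dt)
            v = v_reset  # ...after which the potential is reset.
        v_trace.append(v)
    return np.array(v_trace), spike_times

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    current = rng.uniform(0.0, 2.5, size=200)  # noisy input drive
    _, spikes = simulate_lif(current)
    # The neuron is silent except at threshold crossings, so computation
    # (and hence energy use) is event-driven, unlike dense ANN activations.
    print(f"{len(spikes)} spikes over 200 simulated time steps")
```

The point of the sketch is the `if v >= v_thresh` branch: work happens only when an event occurs, whereas a conventional ANN evaluates every neuron's activation at every forward pass.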
