Weakly-correlated synapses promote dimension reduction in deep neural networks

Jianwen Zhou, Haiping Huang

arXiv.org Machine Learning 

Neural correlation is a common characteristic in most neural computations [1], playing vital roles in stimulus coding [2, 3], information storage [4], and various cognition tasks that can be implemented by recurrent neural networks [5, 6]. Neural correlation was recently shown by a mean-field theory [7] to be able to manipulate the dimensionality of layered representations in deep computations, which was empirically revealed to be a fundamental process in deep artificial neural networks [8]. This theory demonstrates that a [...] transformation of sensory inputs. All incoming synapses to a hidden neuron form a receptive field (RF) of that hidden neuron. The correlation among synapses is modeled by the inter-RF correlation (Figure 1). We do not need prior knowledge about the synaptic correlation strength. In fact, our mean-field theory yields different scaling behaviors of synaptic correlation with respect to the number of neurons at each layer, for both binary and continuous synaptic weights. The scaling behaviors are exactly a requirement of mathematically well-defined dimensionality.
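The effect described above, that correlation among receptive fields reshapes the dimensionality of a layered representation, can be illustrated numerically. The sketch below is an assumption-laden toy model, not the paper's mean-field calculation: correlated RFs are built by mixing a shared Gaussian component into each row of the weight matrix (mixing strength `rho` is a hypothetical parameter), and dimensionality is measured by the participation ratio of the hidden-layer covariance spectrum, a standard measure in this literature.

```python
import numpy as np

def hidden_dimensionality(rho, N=200, M=200, P=5000, seed=0):
    """Participation ratio of a one-layer tanh representation whose
    receptive fields (rows of W) overlap by ~rho on average.

    Toy construction (an assumption for illustration): each RF is a
    mixture of a shared Gaussian direction and a private one, so the
    inter-RF overlap <w_i . w_j> is approximately rho for i != j.
    """
    rng = np.random.default_rng(seed)
    shared = rng.standard_normal(N)           # component common to all RFs
    private = rng.standard_normal((M, N))     # independent per-RF component
    W = (np.sqrt(rho) * shared + np.sqrt(1.0 - rho) * private) / np.sqrt(N)

    # Propagate i.i.d. Gaussian inputs through the layer.
    H = np.tanh(rng.standard_normal((P, N)) @ W.T)

    # Participation ratio D = (sum_k lam_k)^2 / sum_k lam_k^2 of the
    # hidden-layer covariance eigenvalues; D ranges from 1 to M.
    lam = np.linalg.eigvalsh(np.cov(H.T))
    return lam.sum() ** 2 / (lam ** 2).sum()

for rho in (0.0, 0.05, 0.2):
    print(f"rho = {rho:4.2f} -> D = {hidden_dimensionality(rho):6.1f}")
```

Even a modest shared component collapses much of the covariance spectrum onto one dominant eigenvalue, so D drops steeply as `rho` grows, which is the qualitative dimension-reduction effect the paragraph describes.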
