Synaptic correlation


Correlations strike back (again): the case of associative memory retrieval

Savin, Cristina, Dayan, Peter, Lengyel, Mate

Neural Information Processing Systems

It has long been recognised that statistical dependencies in neuronal activity need to be taken into account when decoding stimuli encoded in a neural population. Less studied, though equally pernicious, is the need to take account of dependencies between synaptic weights when decoding patterns previously encoded in an auto-associative memory. We show that activity-dependent learning generically produces such correlations, and failing to take them into account in the dynamics of memory retrieval leads to catastrophically poor recall. We derive optimal network dynamics for recall in the face of synaptic correlations caused by a range of synaptic plasticity rules. These dynamics involve well-studied circuit motifs, such as forms of feedback inhibition and experimentally observed dendritic nonlinearities. We therefore show how addressing the problem of synaptic correlations leads to a novel functional account of key biophysical features of the neural substrate.
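The abstract's central claim, that activity-dependent learning generically correlates synaptic weights, can be checked in a toy simulation. The sketch below is a hypothetical setup of ours, not the paper's model: two synapses converging on the same neuron are driven by a covariance-type Hebbian rule on sparse patterns. With the raw (linear) rule the two weights come out uncorrelated; clipping the same drive to a binary weight, one crude stand-in for synaptic nonlinearity, leaves a systematic positive correlation.

```python
# Toy demonstration (assumed setup, not the paper's model): synapses that
# share a neuron become correlated under a nonlinear plasticity rule.
import numpy as np

rng = np.random.default_rng(1)
T, P, f = 20000, 20, 0.2   # network realizations, stored patterns, sparseness

# Centred sparse activities of three neurons over P patterns,
# for T independent realizations of the network.
x = (rng.random((T, P, 3)) < f).astype(float) - f

# Covariance-rule Hebbian "drive" for the two synapses onto neuron 3.
h13 = np.einsum('tp,tp->t', x[:, :, 0], x[:, :, 2])
h23 = np.einsum('tp,tp->t', x[:, :, 1], x[:, :, 2])

# A linear rule keeps the drive as the weight; a clipped rule binarizes it.
for name, g in [("linear ", lambda h: h), ("clipped", np.sign)]:
    r = np.corrcoef(g(h13), g(h23))[0, 1]
    print(f"{name} rule: corr(W_13, W_23) = {r:+.3f}")
```

The dependency enters through the shared neuron: its activity across the stored patterns shapes the conditional statistics of both weights at once, which is exactly the structure that recall dynamics assuming independent weights would miss.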


Weakly-correlated synapses promote dimension reduction in deep neural networks

Zhou, Jianwen, Huang, Haiping

arXiv.org Machine Learning

Neural correlation is a common characteristic in most neural computations [1], playing vital roles in stimulus coding [2, 3], information storage [4] and various cognition tasks that can be implemented by recurrent neural networks [5, 6]. Neural correlation was recently shown by a mean-field theory [7] to be able to manipulate the dimensionality of layered representations in deep computations, which was empirically revealed to be a fundamental process in deep artificial neural networks [8]. This theory demonstrates that a [...] transformation of sensory inputs. All incoming synapses to a hidden neuron form a receptive field (RF) of that hidden neuron. The correlation among synapses is modeled by the inter-RF correlation (Figure 1). We do not need prior knowledge about the synaptic correlation strength. In fact, our mean-field theory yields different scaling behaviors of synaptic correlation with respect to the number of neurons at each layer, for both binary and continuous synaptic weights. The scaling behaviors are exactly a requirement of mathematically well-defined dimensionality.
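As a rough illustration of the effect in the title (an assumed toy setup, not the authors' mean-field theory): each hidden neuron's receptive field (RF) gets a small shared component, which makes the RFs weakly correlated, and the dimensionality of the hidden representation is measured by the participation ratio PR = (sum_i lambda_i)^2 / sum_i lambda_i^2 over the eigenvalues lambda_i of the hidden covariance. Even weak inter-RF correlation visibly compresses the representation.

```python
# Toy sketch (assumed setup): weakly correlated RFs shrink the
# participation-ratio dimensionality of a hidden layer.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_samples = 500, 500, 5000

def participation_ratio(h):
    """PR = (sum lambda)^2 / sum lambda^2 of the covariance spectrum."""
    lam = np.linalg.eigvalsh(np.cov(h.T))
    return lam.sum() ** 2 / (lam ** 2).sum()

x = rng.standard_normal((n_samples, n_in))   # Gaussian inputs

for c in [0.0, 0.01, 0.05]:
    shared = rng.standard_normal(n_in)       # common RF component
    # Each row of W is one neuron's RF; mixing in the shared component
    # with weight sqrt(c) gives an inter-RF correlation of roughly c.
    W = (np.sqrt(1 - c) * rng.standard_normal((n_hid, n_in))
         + np.sqrt(c) * shared) / np.sqrt(n_in)
    h = np.tanh(x @ W.T)                     # hidden representation
    print(f"inter-RF correlation c={c:.2f}: PR = {participation_ratio(h):.1f}")
```

Typical output shows the PR close to the layer width for c = 0 and dropping sharply already at c = 0.05, consistent with the qualitative picture the abstract describes.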

