Effective Learning Requires Neuronal Remodeling of Hebbian Synapses

Neural Information Processing Systems 

This paper revisits the classical neuroscience paradigm of Hebbian learning. We find that a necessary requirement for effective associative memory learning is that the efficacies of a neuron's incoming synapses be uncorrelated. This requirement is difficult to achieve robustly by Hebbian synaptic learning alone, since it depends on network-level information. Effective learning can nevertheless be obtained by a neuronal process that maintains a zero sum of the incoming synaptic efficacies. This normalization drastically improves the memory capacity of associative networks, from an essentially bounded capacity to one that scales linearly with the network's size.
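
The zero-sum mechanism can be illustrated with a toy associative network. The sketch below is a minimal illustration, not the paper's exact model: the network size, pattern count, coding level `p`, the covariance-style Hebb rule, and the threshold-zero retrieval dynamics are all illustrative assumptions. It stores sparse random patterns Hebbianly, then has each neuron shift its incoming efficacies (one row of the weight matrix) so they sum to zero.

```python
import numpy as np

rng = np.random.default_rng(0)

n_neurons = 200    # network size (illustrative value)
n_patterns = 30    # number of memories to store (illustrative value)
p = 0.1            # coding level: fraction of active units per pattern

# Sparse random binary memory patterns (assumed {0,1} coding).
patterns = (rng.random((n_patterns, n_neurons)) < p).astype(float)

# Additive Hebbian learning: each stored pattern contributes an
# outer-product term to the weight matrix (covariance-style rule).
W = np.zeros((n_neurons, n_neurons))
for xi in patterns:
    W += np.outer(xi - p, xi - p)
np.fill_diagonal(W, 0.0)

# Neuronal zero-sum normalization: each neuron subtracts the mean of
# its incoming efficacies so that every row of W sums to zero,
# removing the shared correlated component across its inputs.
W -= W.sum(axis=1, keepdims=True) / (n_neurons - 1)
np.fill_diagonal(W, 0.0)

# Retrieval check: present a corrupted stored pattern and iterate
# simple threshold dynamics (threshold 0 is an assumption here).
probe = patterns[0].copy()
flip = rng.choice(n_neurons, size=10, replace=False)
probe[flip] = 1.0 - probe[flip]
for _ in range(20):
    probe = (W @ probe > 0).astype(float)

print(f"overlap with stored memory: {np.mean(probe == patterns[0]):.2f}")
```

In this sketch the normalization is a purely local, neuron-level operation on the rows of `W`, which is what makes it a plausible neuronal (rather than synaptic) remodeling process.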