History-Dependent Attractor Neural Networks
Meilijson, Isaac, Ruppin, Eytan
Neural Information Processing Systems
We present a methodological framework enabling a detailed description of the performance of Hopfield-like attractor neural networks (ANN) in the first two iterations. Using the Bayesian approach, we find that performance is improved when a history-based term is included in the neuron's dynamics. A further enhancement of the network's performance is achieved by judiciously choosing the censored neurons (those which become active in a given iteration) on the basis of the magnitude of their post-synaptic potentials. The contribution of biologically plausible, censored, history-dependent dynamics is especially marked in conditions of low firing activity and sparse connectivity, two important characteristics of the mammalian cortex. In such networks, the performance attained is higher than the performance of two 'independent' iterations, which represents an upper bound on the performance of history-independent networks.
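The two ingredients the abstract names, a history-based term added to the post-synaptic potential and censoring (updating only the neurons with the largest-magnitude potentials), can be sketched in a toy Hopfield network. This is a minimal illustration, not the paper's Bayesian derivation: the network size, blending weight `history_weight`, and censoring fraction `censor_frac` are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy sizes; none of these values come from the paper.
N, P = 200, 10                        # neurons, stored patterns
patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns) / N       # Hebbian weight matrix
np.fill_diagonal(W, 0.0)              # no self-connections

def iterate(state, prev_field=None, history_weight=0.5, censor_frac=0.3):
    """One update step. If prev_field is given, blend it into the current
    post-synaptic potential (a simple stand-in for the history-based term);
    then only the censor_frac of neurons with the largest |potential| update
    (censoring by PSP magnitude). Parameter values are illustrative."""
    field = W @ state
    if prev_field is not None:
        field = field + history_weight * prev_field
    new_state = state.copy()
    k = int(censor_frac * len(state))
    active = np.argsort(-np.abs(field))[:k]   # strongest-|PSP| neurons fire
    new_state[active] = np.where(field[active] >= 0, 1, -1)
    return new_state, field

# Probe: pattern 0 with 20% of its bits flipped, then two iterations,
# the second one reusing the first iteration's field as "history".
probe = patterns[0] * rng.choice([1, -1], size=N, p=[0.8, 0.2])
m0 = (probe @ patterns[0]) / N
s1, f1 = iterate(probe)
s2, _ = iterate(s1, prev_field=f1)
m2 = (s2 @ patterns[0]) / N
print(f"overlap with stored pattern: before {m0:.2f}, after two iterations {m2:.2f}")
```

Because only the neurons with the strongest potentials update, most updates are reliable and the overlap with the stored pattern grows across the two iterations; the history term lets the second iteration reuse evidence from the first rather than treating it as independent.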
Dec-31-1993