High Order Neural Networks for Efficient Associative Memory Design
Dreyfus, Gérard, Guyon, Isabelle, Nadal, Jean-Pierre, Personnaz, Léon
Neural Information Processing Systems
The designed networks exhibit the desired associative memory function: perfect storage and retrieval of pieces of information and/or sequences of information of any complexity.

INTRODUCTION

In the field of information processing, an important class of potential applications of neural networks arises from their ability to perform as associative memories. Since the publication of J. Hopfield's seminal paper [1], investigations of the storage and retrieval properties of recurrent networks have led to a deep understanding of their properties. The basic limitations of these networks are the following:
- their storage capacity is of the order of the number of neurons;
- they are unable to handle structured problems;
- they are unable to classify non-linearly separable data.

In order to circumvent these limitations, one has to introduce additional non-linearities. This can be done either by using "hidden" non-linear units, or by considering multi-neuron interactions [2]. This paper presents learning rules for networks with multiple interactions, allowing the storage and retrieval either of static pieces of information (autoassociative memory) or of temporal sequences (associative memory), while preventing an explosive growth of the number of synaptic coefficients.

AUTOASSOCIATIVE MEMORY

The problem addressed in this section is how to design an autoassociative memory with a recurrent (or feedback) neural network when the number p of prototypes is large compared to the number n of neurons. We consider a network of n binary neurons, operating in a synchronous mode, with period t.
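To make the multi-neuron interaction idea concrete, the following is a minimal sketch, not the learning rule derived in this paper: it stores a few random +/-1 prototypes in a network whose neurons receive a quadratic local field (a weighted sum over pairs of other neurons), using a simple Hebb-like rule over neuron triples chosen purely for illustration, and relaxes a corrupted prototype under synchronous (parallel) updates. The sizes n and p and the seed are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
n, p = 32, 3                                   # neurons and prototypes (illustrative sizes)
prototypes = rng.choice([-1, 1], size=(p, n))  # random +/-1 prototype patterns

# Third-order coupling tensor W[i, j, k]: influence of the pair (j, k) on neuron i.
# A plain Hebb-like outer product over triples is used here, not the paper's rule.
W = np.zeros((n, n, n))
for s in prototypes:
    W += np.einsum('i,j,k->ijk', s, s, s)
W /= n * n

def recall(state, max_steps=10):
    """Synchronous (parallel) dynamics: every neuron is updated at the same time."""
    for _ in range(max_steps):
        field = np.einsum('ijk,j,k->i', W, state, state)  # quadratic local field
        new_state = np.where(field >= 0, 1, -1)
        if np.array_equal(new_state, state):              # fixed point reached
            return new_state
        state = new_state
    return state

# Retrieval check: flip a few bits of a prototype and let the network relax.
noisy = prototypes[0].copy()
noisy[:3] *= -1
print("prototype 0 recovered:", np.array_equal(recall(noisy), prototypes[0]))

Note that this naive second-order scheme already requires on the order of n^3 coupling coefficients, which illustrates the "explosive growth" that the learning rules presented in the paper are designed to avoid.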