Neural Information Processing Systems
The Capacity of the Kanerva Associative Memory is Exponential
ABSTRACT The capacity of an associative memory is defined as the maximum number of words that can be stored and retrieved reliably by an address within a given sphere of attraction. It is shown by sphere packing arguments that the capacity can grow exponentially as the address length increases. This exponential growth in capacity can actually be achieved by the Kanerva associative memory with optimally chosen parameters. Formulas for these optimal values are provided. The exponential growth in capacity for the Kanerva associative memory contrasts sharply with the sub-linear growth in capacity for the Hopfield associative memory.
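The sphere packing argument can be illustrated with a short calculation: if words are addressed by n-bit strings and each must be retrievable from any address within Hamming distance delta*n, then at most 2^n divided by the volume of a Hamming ball of radius delta*n words can have disjoint spheres of attraction, a quantity that grows exponentially in n. The sketch below is illustrative only; the paper's exact capacity formulas are not reproduced here.

```python
from math import comb, log2

def sphere_packing_bound(n, delta):
    """Upper bound on the number of words with disjoint spheres of
    attraction of Hamming radius delta*n in an n-bit address space."""
    r = int(delta * n)
    ball_volume = sum(comb(n, i) for i in range(r + 1))  # Hamming ball volume
    return 2 ** n // ball_volume

# The exponent of the bound grows roughly linearly in n, i.e. the bound
# itself grows exponentially with the address length.
for n in (64, 128, 256, 512):
    print(n, round(log2(sphere_packing_bound(n, 0.1)), 1))
```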
New Hardware for Massive Neural Networks
Coon, Darryl D., Perera, A. G. Unil
ABSTRACT Transient phenomena associated with forward biased silicon p-n-n structures at 4.2K show remarkable similarities with biological neurons. The devices play a role similar to the two-terminal switching elements in Hodgkin-Huxley equivalent circuit diagrams. The devices provide simpler and more realistic neuron emulation than transistors or op-amps. They have such low power and current requirements that they could be used in massive neural networks. Some observed properties of simple circuits containing the devices include action potentials, refractory periods, threshold behavior, excitation, inhibition, summation over synaptic inputs, synaptic weights, temporal integration, memory, network connectivity modification based on experience, pacemaker activity, firing thresholds, coupling to sensors with graded signal outputs, and the dependence of firing rate on input current.
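Several of the behaviors listed above (firing thresholds, refractory periods, rate dependence on input current) can be mimicked in software by a textbook leaky integrate-and-fire model. The sketch below is only a stand-in illustration of those behaviors; its parameters are arbitrary and do not describe the p-n-n device physics.

```python
def simulate_lif(current, n_steps, dt=1e-4, tau=0.02, v_th=1.0, t_ref=0.002):
    """Leaky integrate-and-fire unit; counts spikes for a constant input.
    All parameters are illustrative, not measured device values."""
    v, refractory, spikes = 0.0, 0.0, 0
    for _ in range(n_steps):
        if refractory > 0:              # refractory period: input is ignored
            refractory -= dt
            continue
        v += dt * (-v / tau + current)  # leaky temporal integration
        if v >= v_th:                   # firing threshold reached
            spikes += 1
            v = 0.0
            refractory = t_ref
    return spikes

# Below threshold no spikes occur; above it the firing rate rises with current.
for current in (40.0, 80.0, 160.0):
    print(current, simulate_lif(current, n_steps=10000))  # ~1 s of simulated time
```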
A Computer Simulation of Olfactory Cortex with Functional Implications for Storage and Retrieval of Olfactory Information
Bower, James M., Wilson, Matthew A.
Using a simple Hebb-type learning rule in conjunction with the cortical dynamics which emerge from the anatomical and physiological organization of the model, the simulations are capable of establishing cortical representations for different input patterns. The basis of these representations lies in the interaction of sparsely distributed, highly divergent/convergent interconnections between modeled neurons. We have shown that different representations can be stored with minimal interference.
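As a rough illustration of how a Hebb-type rule over sparsely distributed interconnections can store several patterns with little interference, the following sketch applies a generic outer-product rule on a random sparse connection mask. It is a toy associative memory, not the olfactory cortex model itself.

```python
import numpy as np

rng = np.random.default_rng(0)
n_units, n_patterns, connection_prob = 200, 5, 0.1

# Sparse, randomly assigned interconnections between modeled units
mask = rng.random((n_units, n_units)) < connection_prob
np.fill_diagonal(mask, False)

# Hebb-type outer-product learning, restricted to existing connections
patterns = rng.choice([-1.0, 1.0], size=(n_patterns, n_units))
w = np.zeros((n_units, n_units))
for p in patterns:
    w += np.outer(p, p)
w *= mask

# Recall: present a degraded cue and let the network settle
state = patterns[0].copy()
state[:40] *= -1                          # corrupt 20% of the cue
for _ in range(10):
    state = np.sign(w @ state)
    state[state == 0] = 1.0
print("overlap with stored pattern:", float(state @ patterns[0]) / n_units)
```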
Learning a Color Algorithm from Examples
Poggio, Tomaso A., Hurlbert, Anya C.
The algorithm, which resembles a new lightness algorithm recently proposed by Land, is approximately equivalent to filtering the image through a center-surround receptive field in individual chromatic channels. The synthesizing technique, optimal linear estimation, requires only one assumption, that the operator that transforms input into output is linear. This assumption is true for a certain class of early vision algorithms that may therefore be synthesized in a similar way from examples. Other methods of synthesizing algorithms from examples, or "learning", such as backpropagation, do not yield a significantly different or better lightness algorithm in the Mondrian world. The linear estimation and backpropagation techniques both produce simultaneous brightness contrast effects. The problems that a visual system must solve in decoding two-dimensional images into three-dimensional scenes (inverse optics problems) are difficult: the information supplied by an image is not sufficient by itself to specify a unique scene. To reduce the number of possible interpretations of images, visual systems, whether artificial or biological, must make use of natural constraints, assumptions about the physical properties of surfaces and lights. Computational vision scientists have derived effective solutions for some inverse optics problems (such as computing depth from binocular disparity) by determining the appropriate natural constraints and embedding them in algorithms. How might a visual system discover and exploit natural constraints on its own? We address a simpler question: given only a set of examples of input images and desired output solutions, can a visual system synthesize the algorithm that transforms input into output?
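A minimal sketch of the optimal linear estimation step, under the single stated assumption that the input-to-output operator is linear: fit the operator by least squares from example input/output pairs. The data below are synthetic placeholders, not Mondrian images.

```python
import numpy as np

rng = np.random.default_rng(0)
n_examples, n_in, n_out = 500, 64, 64

# Hypothetical training set: rows are flattened example inputs and the
# corresponding desired outputs (generated here from a stand-in operator).
X = rng.standard_normal((n_examples, n_in))
true_op = rng.standard_normal((n_in, n_out))
Y = X @ true_op

# Optimal linear estimation: the operator L minimizing ||X @ L - Y||^2.
L, *_ = np.linalg.lstsq(X, Y, rcond=None)

# The learned operator is then applied to a new input.
x_new = rng.standard_normal(n_in)
print("error on new input:", float(np.linalg.norm(x_new @ L - x_new @ true_op)))
```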
Basins of Attraction for Electronic Neural Networks
Marcus, Charles M., Westervelt, R. M.
Basin measurement circuitry periodically opens the network feedback loop, loads raster-scanned initial conditions, and examines the resulting attractor. Plotting the basins for fixed points (memories), we show that overloading an associative memory network leads to irregular basin shapes. The network also includes analog time delay circuitry, and we have shown that delay in symmetric networks can introduce basins for oscillatory attractors. Conditions leading to oscillation are related to the presence of frustration; reducing frustration by diluting the connections can stabilize a delay network.
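The basin-measurement idea can be mimicked in simulation: sweep a grid of initial conditions for a small symmetric analog network with Hebbian weights and record which stored pattern each run settles to. This is only a software caricature of the measurement, with assumed dynamics and an arbitrary network size, not the hardware described in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 16
patterns = rng.choice([-1.0, 1.0], size=(3, n))
w = sum(np.outer(p, p) for p in patterns) / n
np.fill_diagonal(w, 0.0)

def settle(v, dt=0.1, steps=400, gain=5.0):
    """Integrate symmetric analog network dynamics toward a fixed point."""
    for _ in range(steps):
        v = v + dt * (-v + w @ np.tanh(gain * v))
    return np.sign(v)

# "Raster-scan" a 2-D slice of initial conditions (units 0 and 1 swept,
# the rest started near pattern 0) and label each point by the attractor reached.
labels = np.zeros((21, 21), dtype=int)
for i, a in enumerate(np.linspace(-1, 1, 21)):
    for j, b in enumerate(np.linspace(-1, 1, 21)):
        v0 = 0.1 * patterns[0]
        v0[0], v0[1] = a, b
        labels[i, j] = int(np.argmax(patterns @ settle(v0)))
print(labels)   # crude text map of basin membership on this slice
```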
Cycles: A Simulation Tool for Studying Cyclic Neural Networks
The computer program, implemented on the Texas Instruments Explorer/Odyssey system, and the results of numerous experiments are discussed. The program, CYCLES, allows a user to construct, operate, and inspect neural networks containing cyclic connection paths with the aid of a powerful graphics-based interface. Numerous cycles have been studied, including cycles with one or more activation points, non-interruptible cycles, cycles with variable path lengths, and interacting cycles. The final class, interacting cycles, is important due to its ability to implement time-dependent goal processing in neural networks. INTRODUCTION Neural networks are capable of many types of computation. However, the majority of researchers are currently limiting their studies to various forms of mapping systems, such as content addressable memories, expert system engines, and artificial retinas.
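In the spirit of the cycles described above, the toy sketch below propagates a single activation point around a ring of units, one hop per time step. It is not the CYCLES program (which runs on the TI Explorer/Odyssey with a graphics-based interface), just a minimal illustration of a cyclic connection path.

```python
# Minimal sketch of a cyclic connection path: one activation point
# circulating around a ring of units, one hop per time step.
n_units = 6
cycle = [(i, (i + 1) % n_units) for i in range(n_units)]   # ring connectivity

state = [0] * n_units
state[0] = 1                      # a single activation point on the cycle

for step in range(12):
    new_state = [0] * n_units
    for src, dst in cycle:
        if state[src]:            # activity propagates along the cycle
            new_state[dst] = 1
    state = new_state
    print(step, state)
```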