Active Exploration in Dynamic Environments

Neural Information Processing Systems

Many real-valued connectionist approaches to learning control realize exploration by randomness in action selection. This might be disadvantageous when costs are assigned to "negative experiences". The basic idea presented in this paper is to make an agent explore unknown regions in a more directed manner. This is achieved by a so-called competence map, which is trained to predict the controller's accuracy, and is used for guiding exploration. Based on this, a bistable system enables smoothly switching attention between two behaviors - exploration and exploitation - depending on expected costs and knowledge gain. The appropriateness of this method is demonstrated by a simple robot navigation task.
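To make the idea concrete, here is a minimal sketch of competence-guided, directed exploration. It is not the paper's implementation: the per-action competence error, the cost estimates, and the simple threshold standing in for the bistable attention dynamics are all assumptions made for illustration.

import numpy as np

def select_action(q_values, competence_error, expected_cost, beta=5.0, rng=None):
    """Directed-exploration sketch.

    q_values         : current value estimate per action (for exploitation)
    competence_error : predicted controller error per action (expected knowledge gain)
    expected_cost    : anticipated cost of trying each action
    All three are 1-D NumPy arrays of equal length.
    """
    rng = rng or np.random.default_rng()
    drive = competence_error - expected_cost          # net benefit of exploring each action
    probs = np.exp(beta * drive)
    probs /= probs.sum()

    # Stand-in for the bistable attention system: explore while some action
    # still promises more knowledge gain than it costs, otherwise exploit.
    if drive.max() > 0.0:
        return int(rng.choice(len(q_values), p=probs))   # directed exploration
    return int(np.argmax(q_values))                      # exploitation

With a large beta, exploration in this sketch concentrates on the actions where the competence map predicts the controller is least accurate, rather than being spread uniformly at random.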


Modeling Applications with the Focused Gamma Net

Neural Information Processing Systems

The focused gamma network is proposed as one of the possible implementations of the gamma neural model. The focused gamma network is compared with the focused backpropagation network and TDNN for a time series prediction problem, and with ADALINE in a system identification problem.
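For orientation, the gamma model is built around the gamma memory, a cascade of identical leaky taps. The sketch below assumes the standard discrete-time form with a single memory parameter mu; the tap depth K, the value of mu, and the NumPy wrapper are illustrative choices, not details taken from the paper.

import numpy as np

def gamma_memory(u, K=4, mu=0.5):
    """Gamma memory taps: x_0(n) = u(n) and, for k >= 1,
    x_k(n) = (1 - mu) * x_k(n-1) + mu * x_{k-1}(n-1).
    Returns an array of shape (len(u), K + 1) holding all tap signals."""
    x = np.zeros(K + 1)
    taps = np.zeros((len(u), K + 1))
    for n, u_n in enumerate(u):
        x_prev = x.copy()
        x[0] = u_n
        for k in range(1, K + 1):
            x[k] = (1.0 - mu) * x_prev[k] + mu * x_prev[k - 1]
        taps[n] = x
    return taps

In a "focused" architecture the memory sits only at the input layer: the taps feed an otherwise static feedforward network, and mu trades memory depth against temporal resolution.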



Constant-Time Loading of Shallow 1-Dimensional Networks

Neural Information Processing Systems

The complexity of learning in shallow 1-dimensional neural networks has been shown elsewhere to be linear in the size of the network. However, when the network has a huge number of units (as cortex has) even linear time might be unacceptable. Furthermore, the algorithm that was given to achieve this time was based on a single serial processor and was biologically implausible. In this work we consider the more natural parallel model of processing and demonstrate an expected-time complexity that is constant (i.e., independent of the size of the network).


Bayesian Model Comparison and Backprop Nets

Neural Information Processing Systems

The Bayesian model comparison framework is reviewed, and the Bayesian Occam's razor is explained. This framework can be applied to feedforward networks, making possible (1) objective comparisons between solutions using alternative network architectures; (2) objective choice of magnitude and type of weight decay terms; (3) quantified estimates of the error bars on network parameters and on network output. The framework also generates a measure of the effective number of parameters determined by the data. The relationship of Bayesian model comparison to recent work on prediction of generalisation ability (Guyon et al., 1992; Moody, 1992) is discussed.
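For reference, in evidence-based frameworks of this kind the effective number of parameters is commonly expressed through the eigenvalues \lambda_i of the Hessian of the data misfit and the weight-decay (prior) scale \alpha; the following is the standard form, stated here for orientation rather than quoted from the paper.

\[
\gamma \;=\; \sum_{i=1}^{k} \frac{\lambda_i}{\lambda_i + \alpha}, \qquad 0 \le \gamma \le k ,
\]

so a direction in weight space with \lambda_i much larger than \alpha is determined by the data and contributes almost one full parameter to \gamma, while a direction with \lambda_i much smaller than \alpha is determined by the prior and contributes almost nothing.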


Unsupervised learning of distributions on binary vectors using two layer networks

Neural Information Processing Systems

We study a particular type of Boltzmann machine with a bipartite graph structure called a harmonium. Our interest is in using such a machine to model a probability distribution on binary input vectors. We analyze the class of probability distributions that can be modeled by such machines.
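In the now-standard parameterization, a harmonium over visible units v in {0,1}^n and hidden units h in {0,1}^m has a bipartite energy function, and the distribution it models on the visible vectors is obtained by summing out the hidden layer; the bias terms below follow the modern convention and may be arranged differently in the paper's notation.

\[
E(v, h) = -\sum_{i,j} W_{ij}\, v_i h_j - \sum_i b_i v_i - \sum_j c_j h_j ,
\qquad
P(v) = \frac{1}{Z} \sum_{h \in \{0,1\}^m} e^{-E(v, h)} ,
\]

with Z the normalizing constant. Because the graph is bipartite, the hidden units are conditionally independent given v, which simplifies the analysis of which distributions such a machine can represent.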


Induction of Multiscale Temporal Structure

Neural Information Processing Systems

Learning structure in temporally extended sequences is a difficult computational problem because only a fraction of the relevant information is available at any instant. Although variants of back propagation can in principle be used to find structure in sequences, in practice they are not sufficiently powerful to discover arbitrary contingencies, especially those spanning long temporal intervals or involving high-order statistics. For example, in designing a connectionist network for music composition, we have encountered the problem that the net is able to learn musical structure that occurs locally in time (e.g., relations among notes within a musical phrase) but not structure that occurs over longer time periods (e.g., relations among phrases). To address this problem, we require a means of constructing a reduced description of the sequence that makes global aspects more explicit or more readily detectable. I propose to achieve this using hidden units that operate with different time constants.
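A minimal sketch of the core ingredient, hidden units that integrate their inputs with different time constants: units with tau near 1 change slowly and can carry phrase-level and higher-level context, while tau near 0 gives fast, note-level units. The tanh nonlinearity, the weight shapes, and the exact update rule are assumptions for illustration, not the paper's architecture.

import numpy as np

def multiscale_hidden(x_seq, W_in, W_rec, tau):
    """Run a recurrent hidden layer whose units have per-unit time constants.

    x_seq : sequence of input vectors
    W_in  : input weights, shape (n_hidden, n_input)
    W_rec : recurrent weights, shape (n_hidden, n_hidden)
    tau   : vector in [0, 1), one time constant per hidden unit
    """
    h = np.zeros(W_in.shape[0])
    states = []
    for x in x_seq:
        drive = np.tanh(W_in @ x + W_rec @ h)   # instantaneous response
        h = tau * h + (1.0 - tau) * drive       # slow units average over long spans
        states.append(h.copy())
    return np.array(states)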


A Self-Organizing Integrated Segmentation and Recognition Neural Net

Neural Information Processing Systems

Standard pattern recognition systems usually involve a segmentation step prior to the recognition step. For example, it is very common in character recognition to segment characters in a pre-processing step, then normalize the individual characters and pass them to a recognition engine such as a neural network, as in the work of LeCun et al. (1988) and Martin and Pittman (1988). This separation between segmentation and recognition becomes unreliable if the characters are touching each other, touching bounding boxes, broken, or noisy. Other applications such as scene analysis or continuous speech recognition pose similar and more severe segmentation problems. The difficulties encountered in these applications present an apparent dilemma: one cannot recognize the patterns until they are segmented, yet in many cases one cannot segment them until they are recognized.


Burst Synchronization without Frequency Locking in a Completely Solvable Neural Network Model

Neural Information Processing Systems

Recently, synchronization phenomena in neural networks have attracted considerable attention. Gray et al. (1989, 1990) as well as Eckhorn et al. (1988) provided electrophysiological evidence that neurons in the visual cortex of cats discharge in a semi-synchronous, oscillatory manner in the 40 Hz range and that the firing activity of neurons up to 10 mm away is phase-locked with a mean phase shift of less than 3 msec. It has been proposed that this phase synchronization can solve the binding problem for figure-ground segregation (von der Malsburg and Schneider, 1986) and underlie visual attention and awareness (Crick and Koch, 1990). A number of theoretical explanations based on coupled (relaxation) oscillator models have been proposed for burst synchronization (Sompolinsky et al., 1990). The crucial issue of phase synchronization has also recently been addressed by Bush and Douglas (1991), who simulated the dynamics of a network consisting of bursty, layer V pyramidal cells coupled to a common pool of basket cells inhibiting all pyramidal cells.