The Observer-Observation Dilemma in Neuro-Forecasting

Neural Information Processing Systems

We explain how the training data can be separated into clean information and unexplainable noise. Analogous to the data, the neural network is separated into a time-invariant structure used for forecasting and a noisy part. We propose a unified theory connecting the optimization algorithms for cleaning and learning with algorithms that control the data noise and the parameter noise. The combined algorithm allows a data-driven local control of the liability of the network parameters and therefore an improvement in generalization. The approach proves very useful for forecasting the German bond market.




Reinforcement Learning with Hierarchies of Machines

Neural Information Processing Systems

We present a new approach to reinforcement learning in which the policies considered by the learning process are constrained by hierarchies of partially specified machines. This allows for the use of prior knowledge to reduce the search space and provides a framework in which knowledge can be transferred across problems and in which component solutions can be recombined to solve larger and more complicated problems. Our approach can be seen as providing a link between reinforcement learning and "behavior-based" or "teleo-reactive" approaches to control. We present provably convergent algorithms for problem-solving and learning with hierarchical machines and demonstrate their effectiveness on a problem with several thousand states.


An Annealed Self-Organizing Map for Source Channel Coding

Neural Information Processing Systems

It is especially suited for speech and image data which in many applications have to be transmitted under low bandwidth/high noise level conditions. Following the idea of (Farvardin, 1990) and (Luttrell, 1989) of jointly optimizing the codebook and the data representation w.r.t. a given channel noise, we apply a deterministic annealing scheme (Rose, 1990; Buhmann, 1997) to the problem and develop a soft topographic vector quantization algorithm (STVQ).
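The core step of such a deterministic-annealing vector quantizer can be sketched as a temperature-controlled soft assignment of data points to codebook vectors. This is a minimal sketch, assuming a squared-Euclidean distortion and a Gibbs assignment distribution; the paper's actual algorithm additionally couples codebook vectors through a topographic neighborhood, which is omitted here.

```python
import numpy as np

def soft_assignments(x, codebook, T):
    """P(r | x) proportional to exp(-||x - w_r||^2 / T) over codebook vectors w_r.

    As the temperature T is lowered (annealing), the assignment hardens
    toward the nearest codebook vector; at high T it approaches uniform.
    """
    d2 = np.sum((codebook - x) ** 2, axis=1)   # distortion to each codebook vector
    logits = -d2 / T
    p = np.exp(logits - logits.max())           # subtract max for numerical stability
    return p / p.sum()
```

Lowering `T` across iterations, with codebook vectors re-estimated as assignment-weighted means at each temperature, gives the basic annealing schedule the abstract refers to.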



Multi-time Models for Temporally Abstract Planning

Neural Information Processing Systems

The natural abstract actions are to move from room to room. In the reinforcement learning (MDP) framework, a learning agent interacts with an environment at some discrete, lowest-level time scale t = 0, 1, 2, .... On each time step, the agent perceives the state of the environment, s_t, and on that basis chooses a primitive action, a_t.
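The lowest-level interaction loop described here can be sketched in a few lines; the environment (a clipped one-dimensional walk) and its transition function are illustrative assumptions, not the paper's rooms domain.

```python
import random

def step(state, action):
    """Hypothetical environment transition: move along a line, clipped to [0, 5]."""
    return max(0, min(5, state + action))

# Lowest-level MDP interaction: at each discrete time step t = 0, 1, 2, ...
# the agent perceives the current state s_t and chooses a primitive action a_t.
state = 0  # s_0
for t in range(10):
    action = random.choice([-1, +1])  # primitive action a_t
    state = step(state, action)       # environment yields s_{t+1}
```

Temporally abstract actions (such as "move to the next room") would then be modeled as closed-loop policies over many such primitive steps.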


An Incremental Nearest Neighbor Algorithm with Queries

Neural Information Processing Systems

We consider the general problem of learning multi-category classification from labeled examples. We present experimental results for a nearest neighbor algorithm which actively selects samples from different pattern classes according to a querying rule instead of the a priori class probabilities.
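A querying rule of this kind can be sketched as preferring unlabeled points near a class boundary. The specific rule below (query the pool point whose two nearest labeled neighbors disagree on the class) is an illustrative assumption, not the paper's rule.

```python
import numpy as np

def query_index(labeled_X, labeled_y, pool_X):
    """Return the index of the pool point whose two nearest labeled neighbors disagree.

    Hypothetical querying rule: a point whose nearest labeled neighbors carry
    different class labels lies near a decision boundary and is worth labeling.
    """
    scores = []
    for x in pool_X:
        d = np.linalg.norm(labeled_X - x, axis=1)  # distances to labeled set
        nearest = np.argsort(d)[:2]                # two nearest labeled points
        scores.append(1.0 if labeled_y[nearest[0]] != labeled_y[nearest[1]] else 0.0)
    return int(np.argmax(scores))
```

The selected point would be labeled by an oracle and added to the training set, after which the rule is applied again to the remaining pool.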


A Model of Early Visual Processing

Neural Information Processing Systems

We propose a model for early visual processing in primates. The model consists of a population of linear spatial filters which interact through non-linear excitatory and inhibitory pooling. Statistical estimation theory is then used to derive human psychophysical thresholds from the responses of the entire population of units. The model is able to reproduce human thresholds for contrast and orientation discrimination tasks, and to predict contrast thresholds in the presence of masks of varying orientation and spatial frequency.
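The non-linear pooling stage can be sketched as divisive normalization of linear filter outputs; the exponent and saturation constant below are illustrative assumptions, not the paper's fitted parameters.

```python
import numpy as np

def pooled_responses(linear_outputs, sigma=1.0, n=2.0):
    """Divisive-normalization style pooling over a population of linear filters.

    Each unit's response is its rectified, exponentiated drive divided by the
    summed activity of the whole population plus a saturation constant sigma.
    """
    e = np.abs(linear_outputs) ** n          # excitatory drive of each unit
    return e / (sigma ** n + e.sum())        # inhibitory pooling across the population
```

Psychophysical thresholds would then be derived by asking how large a stimulus change must be before the population response vector changes detectably.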


RCC Cannot Compute Certain FSA, Even with Arbitrary Transfer Functions

Neural Information Processing Systems

The proof given here shows that for any finite, discrete transfer function used by the units of an RCC network, there are finite-state automata (FSA) that the network cannot model, no matter how many units are used. The proof also applies to continuous transfer functions with a finite number of fixed-points, such as sigmoid and radial-basis functions.