Noisy Spiking Neurons with Temporal Coding have more Computational Power than Sigmoidal Neurons

Neural Information Processing Systems

Furthermore, it is shown that networks of noisy spiking neurons with temporal coding have strictly larger computational power than sigmoidal neural nets with the same number of units.

1 Introduction and Definitions

We consider a formal model SNN for a spiking neuron network that is basically a reformulation of the spike response model (and of the leaky integrate-and-fire model) without using δ-functions (see [Maass, 1996a] or [Maass, 1996b] for further background).
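For orientation, a minimal sketch of the spike-response potential being reformulated, written in standard notation that we assume here (it is not quoted from the paper):

P_v(t) \;=\; \sum_{u \in \Gamma_v} \; \sum_{s \in F_u,\, s < t} w_{u,v}\, \varepsilon_{u,v}(t - s)

where Γ_v is the set of presynaptic neurons of v, F_u the set of firing times of u, w_{u,v} the synaptic weight, and ε_{u,v} the response kernel. A deterministic neuron fires when P_v(t) crosses a threshold Θ_v(t − t'_v) that depends on the time t'_v of v's last spike; in the noisy version, the probability of firing at time t increases with P_v(t) − Θ_v(t − t'_v).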


Neural Network Models of Chemotaxis in the Nematode Caenorhabditis Elegans

Neural Information Processing Systems

We train recurrent networks to control chemotaxis in a computer model of the nematode C. elegans. The model presented is based closely on the body mechanics, behavioral analyses, neuroanatomy and neurophysiology of C. elegans, each imposing constraints relevant for information processing. Simulated worms moving autonomously in simulated chemical environments display a variety of chemotaxis strategies similar to those of biological worms.

1 INTRODUCTION

The nematode C. elegans provides a unique opportunity to study the neuronal basis of neural computation in an animal capable of complex goal-oriented behaviors. The adult hermaphrodite is only 1 mm long, and has exactly 302 neurons and 95 muscle cells. The morphology of every cell and the location of most electrical and chemical synapses are known precisely (White et al., 1986), making C. elegans especially attractive for study.
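As a purely illustrative sketch of the kind of controller involved (a generic rate-model recurrent network, not the paper's architecture; all names and parameters here are assumptions):

import numpy as np

def rnn_step(x, u, W, W_in, w_out, dt=0.1, tau=1.0):
    # One Euler step of a rate-model recurrent controller: u is the
    # chemosensory input (attractant concentration at the head), x the
    # hidden state, and the returned scalar a steering/turning signal.
    x = x + (dt / tau) * (-x + np.tanh(W @ x + W_in * u))
    return x, float(w_out @ x)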


Statistically Efficient Estimations Using Cortical Lateral Connections

Neural Information Processing Systems

Coarse codes are widely used throughout the brain to encode sensory and motor variables. Methods designed to interpret these codes, such as population vector analysis, are either inefficient, i.e., the variance of the estimate is much larger than the smallest possible variance, or biologically implausible, like maximum likelihood. Moreover, these methods attempt to compute a scalar or vector estimate of the encoded variable. Neurons are faced with a similar estimation problem. They must read out the responses of the presynaptic neurons, but, by contrast, they typically encode the variable with a further population code rather than as a scalar. We show how a nonlinear recurrent network can be used to perform this estimation in an optimal way while keeping the estimate in a coarse code format. This work suggests that lateral connections in the cortex may be involved in cleaning up uncorrelated noise among neurons representing similar variables.
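A common form of such recurrent clean-up dynamics pools the activity of similarly tuned units through lateral connections and applies a squaring nonlinearity with divisive normalization; the sketch below is one plausible instance under assumed names and parameters, not necessarily the paper's exact update:

import numpy as np

def cleanup_step(a, W, s=0.001, mu=0.002):
    # a: noisy population activity over units tuned to nearby values;
    # W: translation-invariant (circulant) lateral weight matrix.
    u = W @ a                                   # lateral pooling of similar units
    return u ** 2 / (s + mu * np.sum(u ** 2))   # squaring + divisive normalization

Iterating this map drives the activity toward a smooth hill whose peak tracks a near-maximum-likelihood estimate, while the result remains a coarse population code rather than a scalar.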


Multilayer Neural Networks: One or Two Hidden Layers?

Neural Information Processing Systems

In dimension d = 2, Gibson characterized the functions computable with just one hidden layer, under the assumption that there is no "multiple intersection point" and that f
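For intuition, the canonical construction separating the two cases uses a first hidden layer of threshold units to test halfspaces, a second to AND them into convex cells, and an output unit to OR the cells; the sketch below is an assumed illustration of that construction, not the paper's:

import numpy as np

def heaviside(z):
    return (np.asarray(z) >= 0).astype(float)

def two_hidden_layer_net(x, halfspaces, cells):
    # halfspaces: (H, d+1) rows [w, b] defining the test w.x + b >= 0;
    # cells: (R, H) binary mask of which halfspaces bound each convex cell.
    h1 = heaviside(halfspaces[:, :-1] @ x + halfspaces[:, -1])  # side of each hyperplane
    h2 = heaviside(cells @ h1 - cells.sum(axis=1) + 0.5)        # AND within each cell
    return float(heaviside(h2.sum() - 0.5))                     # OR across cells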


Reinforcement Learning for Mixed Open-loop and Closed-loop Control

Neural Information Processing Systems

Closed-loop control relies on sensory feedback that is usually assumed to be free. But if sensing incurs a cost, it may be cost-effective to take sequences of actions in open-loop mode. We describe a reinforcement learning algorithm that learns to combine open-loop and closed-loop control when sensing incurs a cost. Although we assume reliable sensors, use of open-loop control means that actions must sometimes be taken when the current state of the controlled system is uncertain. This is a special case of the hidden-state problem in reinforcement learning, and to cope, our algorithm relies on short-term memory. The main result of the paper is a rule that significantly limits exploration of possible memory states by pruning memory states for which the estimated value of information is greater than its cost. We prove that this rule allows convergence to an optimal policy.
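A minimal sketch of the pruning test described above, under assumed names (Q-values over hidden states and a scalar sensing cost); this illustrates the value-of-information comparison, not the paper's implementation:

import numpy as np

def prune_memory_state(belief, Q, sensing_cost):
    # belief[s]: probability of hidden state s in this memory state;
    # Q[s, a]: estimated state-action values.
    v_informed = np.sum(belief * Q.max(axis=1))  # sense first, then act per true state
    v_open_loop = (belief @ Q).max()             # commit to one action under uncertainty
    value_of_information = v_informed - v_open_loop
    # If information is worth more than sensing costs, the agent should sense
    # here rather than act open-loop, so this memory state need not be explored.
    return value_of_information > sensing_cost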


Complex-Cell Responses Derived from Center-Surround Inputs: The Surprising Power of Intradendritic Computation

Neural Information Processing Systems

Biophysical modeling studies have previously shown that cortical pyramidal cells driven by strong NMDA-type synaptic currents and/or containing dendritic voltage-dependent Ca or Na channels respond more strongly when synapses are activated in several spatially clustered groups of optimal size, in comparison to the same number of synapses activated diffusely about the dendritic arbor [8]. The nonlinear intradendritic interactions giving rise to this "cluster sensitivity" property are akin to a layer of virtual nonlinear "hidden units" in the dendrites, with implications for the cellular basis of learning and memory [7, 6], and for certain classes of nonlinear sensory processing [8]. In the present study, we show that a single neuron, with access only to excitatory inputs from unoriented ON- and OFF-center cells in the LGN, exhibits the principal nonlinear response properties of a "complex" cell in primary visual cortex, namely orientation tuning coupled with translation invariance and contrast insensitivity. We conjecture that this type of intradendritic processing could explain how complex cell responses can persist in the absence of oriented simple cell input [13].
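The "virtual hidden units" picture corresponds to summing expansively rectified subunit responses at the soma; the sketch below is one assumed instance (names and the squaring nonlinearity are illustrative stand-ins for NMDA- or channel-mediated amplification, not the paper's biophysical model):

import numpy as np

def complex_cell_response(image, subunit_filters):
    # Each filter pools a spatially clustered group of ON/OFF-center inputs
    # laid out along the cell's preferred orientation; shifted clusters give
    # translation invariance, since any one of them can drive the sum.
    drive = np.array([np.sum(f * image) for f in subunit_filters])
    return float(np.sum(np.maximum(drive, 0.0) ** 2))  # rectify, amplify, sum at soma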


Representation and Induction of Finite State Machines using Time-Delay Neural Networks

Neural Information Processing Systems

This work investigates the representational and inductive capabilities of time-delay neural networks (TDNNs) in general, and of two subclasses of TDNN, those with delays only on the inputs (IDNN), and those which include delays on hidden units (HDNN). Both architectures are capable of representing the same class of languages, the definite memory machine (DMM) languages, but the delays on the hidden units in the HDNN help it outperform the IDNN on problems composed of repeated features over short time windows.
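Concretely, an IDNN is an ordinary feedforward net applied to a sliding window of the last k inputs, which is exactly the definite-memory restriction: the output can depend on nothing older than k steps. A minimal sketch under assumed weight names:

import numpy as np

def idnn_outputs(seq, k, W1, b1, w2, b2):
    # seq: numeric input sequence; W1: (h, k), b1: (h,), w2: (h,), b2: scalar.
    outs = []
    for t in range(k, len(seq) + 1):
        window = np.asarray(seq[t - k:t], dtype=float)  # only the last k symbols
        h = np.tanh(W1 @ window + b1)                   # hidden layer, no delays
        outs.append(float(np.tanh(w2 @ h + b2)))
    return outs

An HDNN additionally places delays on hidden units, letting features detected over short sub-windows be reused, which is what helps on the repeated-feature problems mentioned above.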


Separating Style and Content

Neural Information Processing Systems

We seek to analyze and manipulate two factors, which we call style and content, underlying a set of observations. We fit training data with bilinear models which explicitly represent the two-factor structure. These models can adapt easily during testing to new styles or content, allowing us to solve three general tasks: extrapolation of a new style to unobserved content; classification of content observed in a new style; and translation of new content observed in a new style.
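In the symmetric form of such a bilinear model, each observation dimension is a bilinear form of a style vector and a content vector; the sketch below shows the representation only (shapes and names are ours, and fitting, typically by SVD-based alternation, is omitted):

import numpy as np

def bilinear_render(a_style, b_content, W):
    # W: (K, I, J) interaction tensor; a_style: (I,); b_content: (J,).
    # Observation dimension k is y[k] = a_style^T W[k] b_content.
    return np.einsum('i,kij,j->k', a_style, W, b_content)

Because style and content enter multiplicatively, a new style (or new content) can be fit with the other factor held fixed, which is what enables the extrapolation, classification, and translation tasks above.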


Exploiting Model Uncertainty Estimates for Safe Dynamic Control Learning

Neural Information Processing Systems

Figure 2: The task is to move the cart to the origin as quickly as possible without dropping the pole. The bottom three pictures show a trace of the policy execution obtained after one, two, and three trials (shown in increments of 0.5 seconds).

[Accompanying table, data not recovered: Controller | Number of data points used to build the controller | Cost from initial state; first listed controller: LQR]
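For reference, the LQR controller named in the table can be obtained from a linearized cart-pole model by iterating the discrete Riccati equation; a generic sketch under assumed dynamics matrices, not the paper's code:

import numpy as np

def lqr_gain(A, B, Q, R, iters=500):
    # A, B: linearized discrete-time dynamics x' = A x + B u (e.g., identified
    # from data); Q, R: state and action cost matrices. Policy: u = -K x.
    P = Q.copy()
    for _ in range(iters):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return K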


Clustering via Concave Minimization

Neural Information Processing Systems

There are many approaches to this problem, including statistical [9], machine learning [7], and integer and mathematical programming [18, 1] methods. In this paper we concentrate on a simple concave minimization formulation of the problem that leads to a finite and fast algorithm.
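The concave-minimization formulation leads to a k-median iteration: assign each point to its nearest center in the 1-norm, then move each center to the coordinate-wise median of its points. The objective never increases and there are finitely many partitions, so the iteration is finite, as the abstract notes. A sketch under assumed parameter names:

import numpy as np

def k_median(X, k, iters=100, seed=0):
    # X: (n, d) data; returns k cluster centers and point labels.
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(iters):
        d1 = np.abs(X[:, None, :] - C[None, :, :]).sum(axis=2)  # 1-norm distances
        labels = d1.argmin(axis=1)
        newC = np.array([np.median(X[labels == j], axis=0) if np.any(labels == j)
                         else C[j] for j in range(k)])
        if np.allclose(newC, C):
            break
        C = newC
    return C, labels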