Bias-Driven Revision of Logical Domain Theories
Koppel, M., Feldman, R., Segre, A. M.
The theory revision problem is that of how best to revise a deficient domain theory using information contained in examples that expose inaccuracies. In this paper we present our approach to the theory revision problem for propositional domain theories. The approach described here, called PTR, uses probabilities associated with domain theory elements to numerically track the "flow" of proof through the theory. This allows us to measure the precise role of a clause or literal in allowing or preventing a (desired or undesired) derivation for a given example. This information is used to efficiently locate and repair flawed elements of the theory. PTR is proved to converge to a theory which correctly classifies all examples, and shown experimentally to be fast and accurate even for deep theories.
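As a rough illustration of the "flow" idea (a sketch, not the PTR algorithm itself): if each clause of a propositional domain theory carries a probability, a soft proof value can be propagated through the AND/OR structure, and the contribution of each clause or antecedent to a derivation can then be read off numerically. The theory, facts, and probabilities below are hypothetical.

```python
# Hypothetical toy domain theory: each proposition is derived by one or more
# clauses, and each clause carries a probability plus a body of antecedents.
THEORY = {
    "bird":  [(0.9, ["feathers"]), (0.7, ["flies", "lays_eggs"])],
    "flies": [(0.8, ["wings"])],
}

def proof_flow(prop, facts, theory):
    """Soft proof value for prop: noisy-OR over its clauses, with each
    clause contributing its probability times the product of its body."""
    if prop in facts:                       # observed evidence
        return 1.0 if facts[prop] else 0.0
    clauses = theory.get(prop, [])
    if not clauses:                         # underivable leaf
        return 0.0
    fail = 1.0
    for p_clause, body in clauses:
        body_val = 1.0
        for lit in body:
            body_val *= proof_flow(lit, facts, theory)
        fail *= 1.0 - p_clause * body_val   # noisy-OR accumulation
    return 1.0 - fail

facts = {"feathers": True, "wings": False, "lays_eggs": True}
print(proof_flow("bird", facts, THEORY))    # strength of the 'bird' derivation
# Perturbing one clause probability and recomputing shows how strongly that
# clause is responsible for allowing or blocking the derivation.
```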
A Neural Model of Descending Gain Control in the Electrosensory System
Certain species of freshwater tropical fish, known as weakly electric fish, possess an active electric sense that allows them to detect and discriminate objects in their environment using a self-generated electric field (Bullock and Heiligenberg, 1986). They detect objects by sensing small perturbations in this electric field using an array of specialized receptors, known as electroreceptors, that cover their body surface. Weakly electric fish often live in turbid water and tend to be nocturnal. These conditions, which hinder visual perception, do not adversely affect the electric sense. Hence the electrosensory system allows these fish to navigate and capture prey in total darkness, in much the same way that sonar allows echolocating bats to do so.
How Oscillatory Neuronal Responses Reflect Bistability and Switching of the Hidden Assembly Dynamics
Pawelzik, K., Bauer, H.-U., Deppisch, J., Geisel, T.
A switching between apparently coherent (oscillatory) and stochastic episodes of activity has been observed in responses from cat and monkey visual cortex. We describe the dynamics of these phenomena in two parallel approaches, a phenomenological one and a more microscopic one. On the one hand we analyze neuronal responses in terms of a hidden state model (HSM). The parameters of this model are extracted directly from experimental spike trains. They characterize the underlying dynamics as well as the coupling of individual neurons to the network. This phenomenological model thus provides a new framework for the experimental analysis of network dynamics.
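To make the hidden-state picture concrete, a minimal stand-in is a two-state switching process whose states emit spikes at different rates; decoding the state sequence from observed counts then separates coherent from stochastic episodes. The sketch below assumes fixed, hand-chosen transition and rate parameters rather than the authors' estimation procedure; all values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-state switching process: state 0 = stochastic episode,
# state 1 = coherent/oscillatory episode, with different mean counts per bin.
TRANS = np.array([[0.95, 0.05],
                  [0.10, 0.90]])            # state transition probabilities
RATES = np.array([1.0, 4.0])                # Poisson mean spikes per bin

def simulate(n_bins):
    states, counts, s = [], [], 0
    for _ in range(n_bins):
        s = rng.choice(2, p=TRANS[s])
        states.append(s)
        counts.append(rng.poisson(RATES[s]))
    return np.array(states), np.array(counts)

def viterbi(counts):
    """Most likely hidden state sequence under the assumed parameters."""
    n = len(counts)
    log_t = np.log(TRANS)
    # Poisson log-likelihood per bin and state (the log k! term is constant
    # across states, so it can be dropped for decoding).
    log_e = counts[:, None] * np.log(RATES)[None, :] - RATES[None, :]
    delta = np.zeros((n, 2))
    back = np.zeros((n, 2), dtype=int)
    delta[0] = np.log([0.5, 0.5]) + log_e[0]
    for t in range(1, n):
        scores = delta[t - 1][:, None] + log_t   # previous state x current state
        back[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_e[t]
    path = [int(delta[-1].argmax())]
    for t in range(n - 1, 0, -1):
        path.append(back[t, path[-1]])
    return np.array(path[::-1])

true_states, counts = simulate(200)
decoded = viterbi(counts)
print("fraction of bins decoded correctly:", (decoded == true_states).mean())
```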
Topography and Ocular Dominance with Positive Correlations
A computational model addressing the joint formation of topography and ocular dominance is presented. This is motivated by experimental evidence that these phenomena may be subserved by the same mechanisms. An important aspect of this model is that ocular dominance segregation can occur when input activity is both distributed and positively correlated between the eyes. This allows investigation of the dependence of the pattern of ocular dominance stripes on the degree of correlation between the eyes: it is found that increasing correlation leads to narrower stripes. Experiments are suggested to test whether such behaviour occurs in the natural system.
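Background intuition for the correlation question (not the paper's model): in a standard correlation-based analysis, the "difference" eigenmode of the between-eye correlation matrix is what drives segregation, and its eigenvalue shrinks as the between-eye correlation grows, which is why obtaining segregation with positively correlated inputs is the nontrivial part. The snippet below just computes these modes for a 2x2 toy case.

```python
import numpy as np

def eye_modes(c):
    """Eigenmodes of a 2x2 between-eye correlation matrix [[1, c], [c, 1]].
    The sum mode (both eyes grow together) has eigenvalue 1 + c; the
    difference mode (one eye dominates) has eigenvalue 1 - c."""
    C = np.array([[1.0, c], [c, 1.0]])
    vals, vecs = np.linalg.eigh(C)
    return [(round(float(v), 3), vecs[:, i].round(3)) for i, v in enumerate(vals)]

for c in (0.0, 0.4, 0.8):
    print(f"between-eye correlation {c}: (eigenvalue, mode) = {eye_modes(c)}")
```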
Improving Performance in Neural Networks Using a Boosting Algorithm
Drucker, Harris, Schapire, Robert, Simard, Patrice
A boosting algorithm converts a learning machine with error rate less than 50% to one with an arbitrarily low error rate. However, the algorithm discussed here depends on having a large supply of independent training samples. We show how to circumvent this problem and generate an ensemble of learning machines whose performance in optical character recognition problems is dramatically improved over that of a single network. We report the effect of boosting on four databases (all handwritten) consisting of 12,000 digits from segmented ZIP codes from the United States Postal Service (USPS) and the following from the National Institute of Standards and Technology (NIST): 220,000 digits, 45,000 upper case alphas, and 45,000 lower case alphas. We use two performance measures: the raw error rate (no rejects) and the reject rate required to achieve a 1% error rate on the patterns not rejected.
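A minimal sketch of the boosting-by-filtering recipe, assuming a three-classifier ensemble combined by simple majority voting (the paper's networks, data, and combination rule differ; the classifiers and dataset below are stand-ins):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=6000, n_features=20, n_informative=10,
                           random_state=0)

def fit_net(X, y):
    return MLPClassifier(hidden_layer_sizes=(32,), max_iter=300,
                         random_state=0).fit(X, y)

# Net 1: trained on an initial chunk of the data.
net1 = fit_net(X[:2000], y[:2000])

# Net 2: trained on a filtered set in which roughly half the patterns are
# ones that net 1 misclassifies.
X2, y2 = X[2000:4000], y[2000:4000]
wrong = net1.predict(X2) != y2
if 0 < wrong.sum() < len(y2):
    n = min(wrong.sum(), (~wrong).sum())
    idx = np.concatenate([np.flatnonzero(wrong)[:n], np.flatnonzero(~wrong)[:n]])
else:
    idx = np.arange(len(y2))      # degenerate case: keep the whole chunk
net2 = fit_net(X2[idx], y2[idx])

# Net 3: trained on patterns on which nets 1 and 2 disagree.
X3, y3 = X[4000:5000], y[4000:5000]
disagree = net1.predict(X3) != net2.predict(X3)
if disagree.sum() < 50 or len(set(y3[disagree])) < 2:
    disagree[:] = True            # fall back to the whole pool in this toy setting
net3 = fit_net(X3[disagree], y3[disagree])

# Ensemble decision by majority vote over the three networks.
Xt, yt = X[5000:], y[5000:]
votes = np.stack([m.predict(Xt) for m in (net1, net2, net3)])
ensemble = (votes.sum(axis=0) >= 2).astype(int)
print("net 1 error:    ", (net1.predict(Xt) != yt).mean())
print("ensemble error: ", (ensemble != yt).mean())
```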
Adaptive Stimulus Representations: A Computational Theory of Hippocampal-Region Function
Gluck, Mark A., Myers, Catherine E.
We present a theory of cortico-hippocampal interaction in discrimination learning. The hippocampal region is presumed to form new stimulus representations which facilitate learning by enhancing the discriminability of predictive stimuli and compressing stimulus-stimulus redundancies. The cortical and cerebellar regions are presumed to be the sites of long-term memory.
Weight Space Probability Densities in Stochastic Learning: I. Dynamics and Equilibria
The ensemble dynamics of stochastic learning algorithms can be studied using theoretical techniques from statistical physics. We develop the equations of motion for the weight space probability densities for stochastic learning algorithms. We discuss equilibria in the diffusion approximation and provide expressions for special cases of the LMS algorithm. The equilibrium densities are not in general thermal (Gibbs) distributions in the objective function being minimized, but rather depend upon an effective potential that includes diffusion effects. Finally we present an exact analytical expression for the time evolution of the density for a learning algorithm with weight updates proportional to the sign of the gradient.
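The ensemble picture can be reproduced numerically: run many independent copies of a stochastic learning rule and histogram their weights to obtain an empirical weight-space density. The sketch below uses scalar LMS on a hypothetical one-dimensional regression problem; it is illustrative only and does not reproduce the paper's analytical expressions.

```python
import numpy as np

rng = np.random.default_rng(0)

def lms_ensemble(lr, n_nets=5000, n_steps=2000, w_true=2.0, noise=0.5):
    """Empirical weight-space density for scalar LMS: many independently
    trained weights, each updated on its own stream of noisy examples."""
    w = np.zeros(n_nets)
    for _ in range(n_steps):
        x = rng.normal(size=n_nets)                  # fresh input per network
        y = w_true * x + noise * rng.normal(size=n_nets)
        err = y - w * x
        w += lr * err * x                            # LMS / stochastic gradient step
    return w

for lr in (0.01, 0.1):
    w = lms_ensemble(lr)
    print(f"lr={lr}: mean={w.mean():.3f}, std={w.std():.3f}")
# The empirical density concentrates around w_true, with a width that grows
# with the learning rate: a diffusion effect that a Gibbs distribution in the
# squared error alone would not capture.
```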
Parameterising Feature Sensitive Cell Formation in Linsker Networks in the Auditory System
Walton, Lance C., Bisset, David L.
This paper examines and extends the work of Linsker (1986) on self-organising feature detectors. Linsker concentrates on the visual processing system, but infers that the weak assumptions made will allow the model to be used in the processing of other sensory information. This claim is examined here, with special attention paid to the auditory system, where there is much lower connectivity and therefore more statistical variability. Online training is utilised to obtain an idea of training times. These are then compared to the time available to prenatal mammals for the formation of feature sensitive cells. 1 INTRODUCTION Within the last thirty years, a great deal of research has been carried out in an attempt to understand the development of cells in the pathways between the sensory apparatus and the cortex in mammals. For example, theories for the development of feature detectors were forwarded by Nass and Cooper (1975), by Grossberg (1976) and more recently by Obermayer et al (1990). Hubel and Wiesel (1961) established the existence of several different types of feature sensitive cell in the visual cortex of cats. Various subsequent experiments have shown that a considerable amount of development takes place before birth.
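For orientation only, a toy Linsker-style development run for a single unit with a limited number of connections might look like the following. This is a sketch under assumed parameters (correlation length, decay constant, weight bounds), not the simulations reported in the paper; the small connection count merely echoes the point about lower connectivity and higher statistical variability in the auditory pathway.

```python
import numpy as np

rng = np.random.default_rng(2)

def develop_unit(n_inputs=24, corr_len=4.0, steps=5000, lr=0.01, k2=-0.2):
    """Toy Linsker-style development of one unit's weights under a bounded
    Hebbian rule, driven by spatially correlated random input activity."""
    pos = np.arange(n_inputs)
    # Input covariance falls off with distance between input positions.
    cov = np.exp(-0.5 * ((pos[:, None] - pos[None, :]) / corr_len) ** 2)
    L = np.linalg.cholesky(cov + 1e-9 * np.eye(n_inputs))
    w = rng.uniform(-0.1, 0.1, n_inputs)
    for _ in range(steps):
        x = L @ rng.normal(size=n_inputs)     # one correlated activity pattern
        y = w @ x                             # the unit's response
        w += lr * (y * x + k2)                # Hebbian term plus a decay constant
        w = np.clip(w, -1.0, 1.0)             # hard weight bounds, as in Linsker
    return w

print("developed weight profile:", develop_unit().round(2))
```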
Word Space
Representations for semantic information about words are necessary for many applications of neural networks in natural language processing. This paper describes an efficient, corpus-based method for inducing distributed semantic representations for a large number of words (50,000) from lexical co-occurrence statistics by means of a large-scale linear regression. The representations are successfully applied to word sense disambiguation using a nearest neighbor method. 1 Introduction Many tasks in natural language processing require access to semantic information about lexical items and text segments.
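The general recipe (co-occurrence vectors, dimensionality reduction, nearest-neighbour comparison) can be sketched on a toy corpus as below; the SVD here merely stands in for the paper's large-scale linear regression, and the corpus, window size, and dimensionality are assumptions.

```python
import numpy as np

# Hypothetical toy corpus; the paper works from a large corpus and 50,000 words.
corpus = ("the bank raised interest rates . the river bank flooded . "
          "interest rates fell . the river flooded the town .").split()

vocab = sorted(set(corpus) - {"."})
index = {w: i for i, w in enumerate(vocab)}

# Symmetric co-occurrence counts within a +/- 2 word window.
window = 2
C = np.zeros((len(vocab), len(vocab)))
for i, w in enumerate(corpus):
    if w not in index:
        continue
    for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
        if j != i and corpus[j] in index:
            C[index[w], index[corpus[j]]] += 1

# Low-rank compression of the co-occurrence statistics; the SVD stands in for
# the large-scale linear regression used in the paper (an assumption).
U, S, _ = np.linalg.svd(C, full_matrices=False)
vectors = U[:, :4] * S[:4]

def nearest(word, k=3):
    """Nearest neighbours of a word vector by cosine similarity."""
    v = vectors[index[word]]
    sims = vectors @ v / (np.linalg.norm(vectors, axis=1) * np.linalg.norm(v) + 1e-12)
    order = np.argsort(-sims)
    return [vocab[i] for i in order if vocab[i] != word][:k]

print("neighbours of 'bank':", nearest("bank"))
```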