

Statistical Mechanics of Temporal Association in Neural Networks

Neural Information Processing Systems

Basic computational functions of associative neural structures may be analytically studied within the framework of attractor neural networks where static patterns are stored as stable fixed-points for the system's dynamics. If the interactions between single neurons are instantaneous and mediated by symmetric couplings, there is a Lyapunov function for the retrieval dynamics (Hopfield 1982). The global computation corresponds in that case to a downhill motion in an energy landscape created by the stored information. Methods of equilibrium statistical mechanics may be applied and permit a quantitative analysis of the asymptotic network behavior (Amit et al. 1985, 1987). The existence of a Lyapunov function is thus of great conceptual as well as technical importance. Nevertheless, one should be aware that environmental inputs to a neural net always provide information in both space and time. It is therefore desirable to extend the original Hopfield scheme and to explore possibilities for a joint representation of static patterns and temporal associations.
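The Lyapunov (energy) function referred to above is the standard Hopfield form E(s) = -1/2 Σ_ij w_ij s_i s_j, whose value cannot increase under asynchronous threshold updates when the couplings are symmetric. Below is a minimal sketch of that downhill retrieval dynamics, assuming a Hebbian weight matrix over random ±1 patterns; the network size, pattern count, and update schedule are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Store a few random +/-1 patterns with the Hebb rule (symmetric couplings,
# zero self-interaction), then check that asynchronous threshold updates
# never increase the Hopfield energy E(s) = -1/2 * s^T W s.
N, P = 100, 5
patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

def energy(s):
    return -0.5 * s @ W @ s

s = rng.choice([-1, 1], size=N)           # random initial state
for _ in range(10 * N):                   # asynchronous single-unit updates
    i = rng.integers(N)
    e_before = energy(s)
    s[i] = 1 if W[i] @ s >= 0 else -1     # align unit i with its local field
    assert energy(s) <= e_before + 1e-12  # downhill motion in the landscape

overlaps = patterns @ s / N               # retrieval quality per stored pattern
print("best overlap with a stored pattern:", overlaps.max())
```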


A Method for the Efficient Design of Boltzmann Machines for Classification Problems

Neural Information Processing Systems

A Boltzmann machine ([AHS], [HS], [AK]) is a neural network model in which the units update their states according to a stochastic decision rule. It consists of a set U of units, a set C of unordered pairs of elements of U, and an assignment of connection strengths S: C → ℝ. A configuration of a Boltzmann machine is a map k: U → {0, 1}.
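As a concrete illustration of the stochastic decision rule, the sketch below updates one unit at a time, turning it on with a sigmoid probability of its energy gap at temperature T. The network size, random strengths, and annealing schedule are illustrative assumptions; this is a generic Boltzmann-machine update, not the design method of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Units take states in {0, 1}; connections are unordered pairs with
# strengths S (hence a symmetric matrix); a unit turns on with probability
# sigmoid(gap / T), where gap is its energy difference for on vs. off.
n = 6
S = rng.normal(size=(n, n))
S = (S + S.T) / 2                 # unordered pairs => symmetric strengths
np.fill_diagonal(S, 0.0)

def update(k, T):
    i = rng.integers(n)
    gap = S[i] @ k                # energy gap for unit i: on vs. off
    p_on = 1.0 / (1.0 + np.exp(-gap / T))
    k[i] = 1 if rng.random() < p_on else 0

k = rng.integers(0, 2, size=n)    # a configuration k: U -> {0, 1}
for T in (4.0, 2.0, 1.0, 0.5):    # simple illustrative annealing schedule
    for _ in range(100):
        update(k, T)
print("final configuration:", k)
```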


Back Propagation Implementation on the Adaptive Solutions CNAPS Neurocomputer Chip

Neural Information Processing Systems

Most investigators have created large parallel computers or special-purpose chips limited to a small subset of algorithms. The Adaptive Solutions CNAPS architecture is a general-purpose 64-processor chip that supports on-chip learning and is capable of implementing most current algorithms. An implementation of the popular Back Propagation (BP) algorithm demonstrates the speed and versatility of this new chip.
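For reference, the BP algorithm itself fits in a few lines; the sketch below trains a tiny two-layer network on XOR in plain Python, as a statement of the algorithm rather than of its CNAPS implementation (layer sizes, learning rate, and task are arbitrary illustrative choices).

```python
import numpy as np

rng = np.random.default_rng(2)

# Two-layer back propagation with sigmoid units on the XOR task.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(scale=0.5, size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(scale=0.5, size=(4, 1)), np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 1.0

for step in range(10000):
    h = sigmoid(X @ W1 + b1)                  # forward pass
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)       # output-layer delta
    d_h = (d_out @ W2.T) * h * (1 - h)        # back-propagated hidden delta
    W2 -= lr * h.T @ d_out                    # gradient-descent updates
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(np.round(out.ravel(), 2))               # should approach [0, 1, 1, 0]
```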


ε-Entropy and the Complexity of Feedforward Neural Networks

Neural Information Processing Systems

We are concerned with the number of nodes needed in a feedforward neural network in order to represent a function to within a specified accuracy.


Learning Time-varying Concepts

Neural Information Processing Systems

This work extends computational learning theory to situations in which concepts vary over time, e.g., system identification of a time-varying plant. We have extended formal definitions of concepts and learning to provide a framework in which an algorithm can track a concept as it evolves over time. Given this framework and focusing on memory-based algorithms, we have derived some PAC-style sample complexity results that determine, for example, when tracking is feasible. We have also used a similar framework and focused on incremental tracking algorithms, for which we have derived some bounds on the mistake or error rates for some specific concept classes.

1 INTRODUCTION

The goal of our ongoing research is to extend computational learning theory to include concepts that can change or evolve over time. For example, face recognition is complicated by the fact that a person's face changes slowly with age and more quickly with changes in makeup, hairstyle, or facial hair.
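As a toy illustration of memory-based tracking, the sketch below classifies points against a drifting threshold concept by nearest neighbor over a sliding window of recent labeled examples. The drift rate, window size, and mistake-rate accounting are illustrative assumptions, not constructions from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# Track a drifting threshold concept on [0, 1] with a memory-based
# learner: predict each new point's label by nearest neighbor over the
# w most recent labeled examples, then add the point to memory.
w = 30                       # memory window (illustrative)
memory = []                  # (x, label) pairs, most recent last
threshold, mistakes, T = 0.5, 0, 2000

for t in range(T):
    x = rng.random()
    label = int(x > threshold)
    if memory:
        xs, ys = zip(*memory)
        pred = ys[int(np.argmin(np.abs(np.array(xs) - x)))]
        mistakes += int(pred != label)
    memory.append((x, label))
    memory = memory[-w:]                  # forget stale examples
    threshold += rng.normal(scale=0.005)  # the concept drifts slowly
    threshold = min(max(threshold, 0.1), 0.9)

print("mistake rate while tracking:", mistakes / T)
```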



Connectionist Music Composition Based on Melodic and Stylistic Constraints

Neural Information Processing Systems

We describe a recurrent connectionist network, called CONCERT, that uses a set of melodies written in a given style to compose new melodies in that style. CONCERT is an extension of a traditional algorithmic composition technique in which transition tables specify the probability of the next note as a function of previous context. A central ingredient of CONCERT is the use of a psychologically-grounded representation of pitch.
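The traditional technique that CONCERT extends is easy to state in code: sample the next note from a transition table conditioned on the previous note. The sketch below uses a toy first-order table with made-up probabilities, not statistics drawn from any real melody corpus.

```python
import numpy as np

rng = np.random.default_rng(4)

# First-order transition-table composition: row i gives the probability
# of each next note given that the previous note was notes[i].
# The note set and probabilities are toy values for illustration.
notes = ["C", "D", "E", "G"]
table = np.array([
    [0.1, 0.4, 0.4, 0.1],   # after C
    [0.3, 0.1, 0.4, 0.2],   # after D
    [0.4, 0.3, 0.1, 0.2],   # after E
    [0.5, 0.2, 0.2, 0.1],   # after G
])

melody = ["C"]
for _ in range(15):
    row = table[notes.index(melody[-1])]
    melody.append(rng.choice(notes, p=row))
print(" ".join(melody))
```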


A Framework for the Cooperation of Learning Algorithms

Neural Information Processing Systems

We introduce a framework for training architectures composed of several modules. This framework, which uses a statistical formulation of learning systems, provides a unique formalism for describing many classical connectionist algorithms as well as complex systems where several algorithms interact. It allows one to design hybrid systems that combine the advantages of connectionist algorithms with those of other learning algorithms.
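A minimal sketch of such cooperation, under the assumption of a shared differentiable cost: two modules are trained jointly, with the gradient of the common cost flowing from the downstream module back into the upstream one, so each module's learning rule sees the other's influence. The module sizes, toy regression task, and learning rate are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(5)

# Two cooperating modules trained jointly on a shared quadratic cost:
# module A extracts features, module B reads them out, and the cost
# gradient is handed back through B into A.
X = rng.normal(size=(64, 3))
target = np.sin(X.sum(axis=1, keepdims=True))

A = rng.normal(scale=0.3, size=(3, 8))   # module A: feature extractor
B = rng.normal(scale=0.3, size=(8, 1))   # module B: linear readout

for step in range(500):
    h = np.tanh(X @ A)                   # module A forward
    out = h @ B                          # module B forward
    err = out - target                   # shared cost: mean squared error
    grad_B = h.T @ err / len(X)
    grad_h = err @ B.T * (1 - h**2)      # gradient handed back to module A
    grad_A = X.T @ grad_h / len(X)
    B -= 0.1 * grad_B
    A -= 0.1 * grad_A

print("final mse:", float(np.mean((np.tanh(X @ A) @ B - target) ** 2)))
```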


Spoken Letter Recognition

Neural Information Processing Systems

Through the use of neural network classifiers and careful feature selection, we have achieved high-accuracy speaker-independent spoken letter recognition. For isolated letters, a broad-category segmentation is performed. Location of segment boundaries allows us to measure features at specific locations in the signal, such as vowel onset, where important information resides. Letter classification is performed with a feed-forward neural network. Recognition accuracy on a test set of 30 speakers was 96%. Neural network classifiers are also used for pitch tracking and broad-category segmentation of letter strings.


Asymptotic slowing down of the nearest-neighbor classifier

Neural Information Processing Systems

P_M(error) ≈ P_∞(error) + a/M^(2/n) for sufficiently large values of M. Here, P_∞(error) denotes the probability of error in the infinite sample limit, and is at most twice the error of a Bayes classifier. Although the value of the coefficient a depends upon the underlying probability distributions, the exponent of M is largely distribution free. We thus obtain a concise relation between a classifier's ability to generalize from a finite reference sample and the dimensionality of the feature space, as well as an analytic validation of Bellman's well known "curse of dimensionality."

1 INTRODUCTION

One of the primary tasks assigned to neural networks is pattern classification. Common applications include recognition problems dealing with speech, handwritten characters, DNA sequences, military targets, and (in this conference) sexual identity. Two fundamental concepts associated with pattern classification are generalization (how well does a classifier respond to input data it has never encountered before?) and scalability (how are a classifier's processing and training requirements affected by increasing the number of features that describe the input patterns?).
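The slowing down can be observed empirically. The sketch below estimates the nearest-neighbor error rate for growing reference-sample sizes M on a toy one-dimensional two-class problem; the distributions, sample sizes, and number of repeats are illustrative assumptions, and the estimated error should decay toward its infinite-sample limit as M grows.

```python
import numpy as np

rng = np.random.default_rng(6)

# Estimate nearest-neighbor error versus reference-sample size M on a toy
# problem with overlapping class densities: two 1-D Gaussians with unit
# variance and means 0 and 1 (illustrative choices).
def sample(m):
    labels = rng.integers(0, 2, size=m)
    x = rng.normal(loc=labels.astype(float), scale=1.0)
    return x, labels

x_test, y_test = sample(2000)
for M in (10, 100, 1000, 5000):
    errs = []
    for _ in range(10):                   # average over reference samples
        x_ref, y_ref = sample(M)
        nn = np.abs(x_test[:, None] - x_ref[None, :]).argmin(axis=1)
        errs.append(np.mean(y_ref[nn] != y_test))
    print(f"M = {M:5d}   estimated NN error = {np.mean(errs):.4f}")
```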