

A Hybrid Linear/Nonlinear Approach to Channel Equalization Problems

Neural Information Processing Systems

The channel equalization problem is an important problem in high-speed communications: transmitted symbol sequences are distorted by neighboring symbols. Traditionally, channel equalization is treated as a channel-inversion operation. One drawback of this approach is that there is no direct correspondence between the error probability and the residual error produced by the channel-inversion operation. In this paper, optimal equalizer design is instead formulated as a classification problem. The optimal classifier can be constructed by the Bayes decision rule and is, in general, nonlinear. An efficient hybrid linear/nonlinear approach is proposed for training the equalizer. On an experimental channel, the error probability of the new linear/nonlinear equalizer is shown to be better than that of a linear equalizer.
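A minimal sketch of the classification view of equalization described in this abstract. The 2-tap channel, noise level, and binary symbol alphabet below are illustrative assumptions, not taken from the paper; with equal priors and Gaussian noise, the Bayes decision rule reduces to a nearest-center rule over channel states.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

h = np.array([1.0, 0.5])  # hypothetical 2-tap channel: r_k = s_k + 0.5*s_{k-1} + n_k
sigma = 0.2               # noise standard deviation (illustrative)

# Enumerate channel states (s_k, s_{k-1}) and their noiseless outputs.
states = list(itertools.product([-1.0, 1.0], repeat=2))
centers = {s: h[0] * s[0] + h[1] * s[1] for s in states}

def bayes_equalize(r):
    """Return the symbol s_k of the most likely channel state given sample r.

    With equal priors and Gaussian noise, the Bayes rule reduces to a
    nearest-center (maximum-likelihood) decision over channel states.
    """
    best = min(states, key=lambda s: (r - centers[s]) ** 2)
    return best[0]

# Simulate a short transmission and measure the symbol error rate.
symbols = rng.choice([-1.0, 1.0], size=1000)
received = h[0] * symbols[1:] + h[1] * symbols[:-1] + sigma * rng.normal(size=999)
decisions = np.array([bayes_equalize(r) for r in received])
error_rate = float(np.mean(decisions != symbols[1:]))
```

The decision boundary here is nonlinear in general (a union of nearest-center cells), which is the abstract's point: no linear channel inverse yields this rule directly.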


Rational Parametrizations of Neural Networks

Neural Information Processing Systems

The activation function is typically a sigmoidal function such as (1.2), but other choices than (1.2) are possible and of interest.


The Power of Approximating: a Comparison of Activation Functions

Neural Information Processing Systems

We compare activation functions in terms of the approximation power of their feedforward nets. We consider the case of analog as well as Boolean input.


Learning to categorize objects using temporal coherence

Neural Information Processing Systems

The invariance of an object's identity as it is transformed over time provides a powerful cue for perceptual learning. We present an unsupervised learning procedure which maximizes the mutual information between the representations adopted by a feed-forward network at consecutive time steps. We demonstrate that the network can learn, entirely unsupervised, to classify an ensemble of several patterns by observing pattern trajectories, even though there are abrupt transitions from one object to another between trajectories. The same learning procedure should be widely applicable to a variety of perceptual learning tasks. A promising approach to understanding human perception is to try to model its developmental stages. There is ample evidence that much of perception is learned.
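A toy sketch of the quantity this abstract's procedure maximizes: the mutual information between the class labels a network assigns at consecutive time steps. The two joint distributions below are illustrative assumptions; when labels persist across a trajectory the mutual information is high, and when they are independent it vanishes.

```python
import numpy as np

def mutual_information(p_joint):
    """I(X; Y) in nats, computed from a joint distribution table."""
    px = p_joint.sum(axis=1, keepdims=True)   # marginal over rows (time t)
    py = p_joint.sum(axis=0, keepdims=True)   # marginal over columns (time t+1)
    nz = p_joint > 0
    return float(np.sum(p_joint[nz] * np.log(p_joint[nz] / (px * py)[nz])))

# Toy joints over (class at time t, class at time t+1):
coherent = np.array([[0.5, 0.0],
                     [0.0, 0.5]])      # labels persist across time steps
incoherent = np.array([[0.25, 0.25],
                       [0.25, 0.25]])  # labels independent across steps
```

For the coherent joint the mutual information is log 2 nats (one bit); for the incoherent one it is zero, so a gradient on this objective pushes the network toward temporally stable classifications.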


Harmonic Grammars for Formal Languages

Neural Information Processing Systems

Basic connectionist principles imply that grammars should take the form of systems of parallel soft constraints defining an optimization problem, the solutions to which are the well-formed structures in the language. Such Harmonic Grammars have been successfully applied to a number of problems in the theory of natural languages. Here it is shown that formal languages too can be specified by Harmonic Grammars, rather than by conventional serial rewrite rule systems. In collaboration with Geraldine Legendre, Yoshiro Miyata, and Alan Prince, I have been studying how symbolic computation in human cognition can arise naturally as a higher-level virtual machine realized in appropriately designed lower-level connectionist networks. The basic computational principles of the approach are these: (1) a. When analyzed at the lower level, mental representations are distributed patterns of connectionist activity; when analyzed at a higher level, these same representations constitute symbolic structures.
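A minimal sketch of the "parallel soft constraints as optimization" idea: candidate structures are scored by a harmony function, and the well-formed structures are those of maximal harmony. The 3-feature weight matrix below is an illustrative assumption, not a grammar from the paper.

```python
import itertools
import numpy as np

# Hypothetical soft constraints among 3 binary features of a candidate
# structure: positive weights reward co-occurrence, negative ones penalize it.
W = np.array([[ 0.0, 2.0, -3.0],
              [ 2.0, 0.0,  1.0],
              [-3.0, 1.0,  0.0]])

def harmony(a):
    """Harmony H(a) = (1/2) a^T W a; well-formed structures maximize it."""
    return 0.5 * float(a @ W @ a)

# Exhaustively score every candidate structure and keep the most harmonic.
candidates = [np.array(bits, dtype=float)
              for bits in itertools.product([0, 1], repeat=3)]
best = max(candidates, key=harmony)
```

With these weights, the structure activating features 1 and 2 but not 3 is uniquely optimal; the soft constraints are evaluated in parallel, with no serial rewriting.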


Forecasting Demand for Electric Power

Neural Information Processing Systems

Our efforts proceed in the context of a problem suggested by the operational needs of a particular electric utility to make daily forecasts of short-term load or demand. Forecasts are made at midday (1 p.m.) on a weekday t (Monday-Thursday), for the next evening peak e(t) (usually occurring about 8 p.m. in the winter), the daily minimum d(t


Adaptive Stimulus Representations: A Computational Theory of Hippocampal-Region Function

Neural Information Processing Systems

We present a theory of cortico-hippocampal interaction in discrimination learning. The hippocampal region is presumed to form new stimulus representations which facilitate learning by enhancing the discriminability of predictive stimuli and compressing stimulus-stimulus redundancies. The cortical and cerebellar regions are presumed to be the sites of long-term memory.


Performance Through Consistency: MS-TDNN's for Large Vocabulary Continuous Speech Recognition

Neural Information Processing Systems

Connectionist speech recognition systems are often handicapped by an inconsistency between training and testing criteria. This problem is addressed by the Multi-State Time Delay Neural Network (MS-TDNN), a hierarchical phoneme and word classifier which uses DTW to modulate its connectivity.
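A minimal sketch of the role DTW plays in a word classifier of this kind: per-frame state scores are aligned to a word's state sequence along the best monotone path, and the path's total score becomes the word score. The tiny score matrix is an illustrative assumption, not data from the paper.

```python
import numpy as np

def dtw_word_score(frame_scores):
    """Best monotone alignment of frames to the word's state sequence,
    summing per-frame state scores (higher is better)."""
    T, S = frame_scores.shape
    acc = np.full((T, S), -np.inf)
    acc[0, 0] = frame_scores[0, 0]          # must start in the first state
    for t in range(1, T):
        for s in range(S):
            stay = acc[t - 1, s]            # remain in the same state
            advance = acc[t - 1, s - 1] if s > 0 else -np.inf
            acc[t, s] = frame_scores[t, s] + max(stay, advance)
    return float(acc[-1, -1])               # must end in the last state

# 3 frames x 2 states: scores favour state 0 early and state 1 late.
scores = np.array([[1.0, 0.0],
                   [0.0, 1.0],
                   [0.0, 1.0]])
```

Because the word score is a differentiable-through-the-path function of the frame scores, training on word-level targets keeps the training criterion consistent with the word-level testing criterion.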


Recognition-based Segmentation of On-Line Hand-printed Words

Neural Information Processing Systems

The input strings consist of a time-ordered sequence of XY coordinates, punctuated by pen-lifts. The methods were designed to work in "run-on mode", where there is no constraint on the spacing between characters. While both methods use a neural network recognition engine and a graph-algorithmic post-processor, their approaches to segmentation are quite different. The first method, which we call INSEC (for input segmentation), uses a combination of heuristics to identify particular pen-lifts as tentative segmentation points. The second method, which we call OUTSEC (for output segmentation), relies on the empirically trained recognition engine for both recognizing characters and identifying relevant segmentation points.
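A minimal sketch of one INSEC-style heuristic on such input: flag a pen-lift as a tentative segmentation point when the horizontal gap to the next stroke is large. The gap threshold and the function name are illustrative assumptions; the paper's heuristics combine several cues.

```python
def tentative_segmentations(strokes, gap_threshold=10.0):
    """Flag pen-lifts whose horizontal gap to the next stroke is large.

    strokes: list of pen-down segments, each a time-ordered list of
    (x, y) points; pen-lift i separates strokes[i] and strokes[i + 1].
    """
    points = []
    for i in range(len(strokes) - 1):
        gap = strokes[i + 1][0][0] - strokes[i][-1][0]
        if gap > gap_threshold:
            points.append(i)   # pen-lift i is a candidate character boundary
    return points

# Three strokes: the first two nearly touch, the third starts far right.
strokes = [[(0, 0), (5, 2)], [(6, 0), (9, 3)], [(25, 0), (30, 2)]]
```

In run-on mode such heuristics are only tentative: the graph-algorithmic post-processor scores alternative segmentations using the recognizer's outputs before committing.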


Computing with Almost Optimal Size Neural Networks

Neural Information Processing Systems

Artificial neural networks are composed of an interconnected collection of certain nonlinear devices; examples of commonly used devices include linear threshold elements, sigmoidal elements, and radial-basis elements. We employ results from harmonic analysis and the theory of rational approximation to obtain almost tight lower bounds on the size (i.e.