
Applications of Error Back-Propagation to Phonetic Classification

Neural Information Processing Systems

This paper is concerned with the use of error back-propagation in phonetic classification. Our objective is to investigate the basic characteristics of back-propagation and to study how the framework of multi-layer perceptrons can be exploited in phonetic recognition.


Neural Network Star Pattern Recognition for Spacecraft Attitude Determination and Control

Neural Information Processing Systems

Phillip Alvelda, A. Miguel San Martin
The Jet Propulsion Laboratory, California Institute of Technology, Pasadena, CA 91109

ABSTRACT
Currently, the most complex spacecraft attitude determination and control tasks are ultimately governed by ground-based systems and personnel. Conventional on-board systems face severe computational bottlenecks introduced by serial microprocessors operating on inherently parallel problems. New computer architectures based on the anatomy of the human brain seem to promise high-speed and fault-tolerant solutions to the limitations of serial processing.

INTRODUCTION
By design, a conventional on-board microprocessor can perform only one comparison or calculation at a time. Image or pattern recognition problems involving large template sets and high resolution can require an astronomical number of comparisons against a given database.


Use of Multi-Layered Networks for Coding Speech with Phonetic Features

Neural Information Processing Systems

McGill University, Montreal, Canada H3A 2A7
Piero Cosi, Centro di Studio per le Ricerche di Fonetica, C.N.R., Via Oberdan 10, 35122 Padova, Italy

ABSTRACT
Preliminary results on speaker-independent speech recognition are reported. A method that combines expertise on neural networks with expertise on speech recognition is used to build the recognition systems. For transient sounds, event-driven property extractors with variable resolution in the time and frequency domains are used. For sonorant speech, a model of the human auditory system is preferred to the FFT as a front-end module.

INTRODUCTION
Our research effort attempts to combine a structural or knowledge-based approach for describing speech units with neural networks capable of automatically learning relations between acoustic properties and speech units.



Training a 3-Node Neural Network is NP-Complete

Neural Information Processing Systems

We consider a 2-layer, 3-node, n-input neural network whose nodes compute linear threshold functions of their inputs. We show that it is NP-complete to decide whether there exist weights and thresholds for the three nodes of this network so that it will produce output consistent with a given set of training examples. We extend the result to other simple networks. This result suggests that those looking for perfect training algorithms cannot escape inherent computational difficulties just by considering only simple or very regular networks. It also suggests the importance, given a training problem, of finding an appropriate network and input encoding for that problem. It is left as an open problem to extend our result to nodes with nonlinear functions such as sigmoids.
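The network in question can be made concrete with a small sketch: two hidden linear threshold units feed a third threshold unit, and the training problem asks whether any weights and thresholds fit all examples. The training set (XOR) and the particular weights below are illustrative choices, not from the paper.

```python
def threshold(w, b, x):
    """Linear threshold unit: 1 if w.x + b > 0, else 0."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

def net(x, w1, b1, w2, b2, w_out, b_out):
    """2-layer, 3-node network: two hidden threshold units feed one output unit."""
    h = (threshold(w1, b1, x), threshold(w2, b2, x))
    return threshold(w_out, b_out, h)

# Illustrative training set (XOR on 2 inputs), not from the paper.
examples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

# One weight/threshold setting consistent with every example.
params = dict(w1=(1, 1), b1=-0.5,        # hidden 1: "at least one input on"
              w2=(-1, -1), b2=1.5,       # hidden 2: "not both inputs on"
              w_out=(1, 1), b_out=-1.5)  # output: AND of the hidden units

assert all(net(x, **params) == y for x, y in examples)
```

Deciding whether such a consistent setting exists at all, for an arbitrary training set, is the problem shown to be NP-complete.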


Adaptive Neural Net Preprocessing for Signal Detection in Non-Gaussian Noise

Neural Information Processing Systems

A nonlinearity is required before matched filtering in minimum-error receivers when the additive noise is impulsive and highly non-Gaussian. Experiments were performed to determine whether the correct clipping nonlinearity could be provided by a single-input single-output multi-layer perceptron trained with back-propagation. It was found that a multi-layer perceptron with one input and output node, 20 nodes in the first hidden layer, and 5 nodes in the second hidden layer could be trained to provide a clipping nonlinearity with fewer than 5,000 presentations of noiseless and corrupted waveform samples. A network trained at a relatively high signal-to-noise (S/N) ratio and then used as a front end for a linear matched filter detector greatly reduced the probability of error. The clipping nonlinearity formed by this network was similar to that used in current receivers designed for impulsive noise and provided similar substantial improvements in performance.
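The receiver structure can be sketched as follows. A hard clipper stands in for the learned nonlinearity (the paper's network approximates a similar shape); the signal template, the heavy-tailed noise model, and the clipping limit are assumptions for illustration, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def clip_nonlinearity(x, limit=1.0):
    # Stand-in for the trained perceptron: a hard clipper that bounds
    # impulsive noise spikes before the matched filter (limit assumed).
    return np.clip(x, -limit, limit)

# Known signal template and impulsive, heavy-tailed (non-Gaussian) noise.
template = np.sin(2 * np.pi * np.arange(64) / 16)
noise = rng.standard_t(df=1.5, size=64)      # Student-t: occasional huge spikes
received = 0.5 * template + noise

# Matched filter statistic with and without the clipping front end.
stat_linear = np.dot(received, template)
stat_clipped = np.dot(clip_nonlinearity(received), template)
```

The clipped statistic is insensitive to individual noise spikes, which is why the nonlinear front end lowers the error probability in impulsive noise.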


Training a Limited-Interconnect, Synthetic Neural IC

Neural Information Processing Systems

Hardware implementation of neuromorphic algorithms is hampered by high degrees of connectivity. Functionally equivalent feedforward networks may be formed by using limited fan-in nodes and additional layers.
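The fan-in/depth trade-off stated above can be sketched with a toy reduction tree (not the paper's construction): a wide summation is replaced by extra layers of nodes that each combine at most `fan_in` inputs.

```python
def fanin_tree(inputs, fan_in=2):
    """Reduce an n-input sum to a tree of limited fan-in adder nodes.
    Each node sums at most fan_in values; depth grows as layers replace
    wide fan-in (a sketch of the idea, not the paper's network)."""
    layer = list(inputs)
    depth = 0
    while len(layer) > 1:
        layer = [sum(layer[i:i + fan_in]) for i in range(0, len(layer), fan_in)]
        depth += 1
    return layer[0], depth

total, layers = fanin_tree(range(8), fan_in=2)  # 8 inputs -> 3 layers of 2-input nodes
```

The same value is computed, but no node ever needs more than `fan_in` incoming connections, which is the property that makes the IC wiring feasible.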


Temporal Representations in a Connectionist Speech System

Neural Information Processing Systems

Erich J. Smythe
207 Greenmanville Ave, #6, Mystic, CT 06355

ABSTRACT
SYREN is a connectionist model that uses temporal information in a speech signal for syllable recognition. It classifies the rates and directions of formant center transitions, and uses an adaptive method to associate transition events with each syllable. The system uses explicit spatial temporal representations through delay lines. SYREN uses implicit parametric temporal representations in formant transition classification through node activation onset, decay, and transition delays in sub-networks analogous to visual motion detector cells. SYREN recognizes 79% of six repetitions of 24 consonant-vowel syllables when tested on unseen data, and recognizes 100% of its training syllables.

INTRODUCTION
Living organisms exist in a dynamic environment. Problem-solving systems, both natural and synthetic, must relate and interpret events that occur over time. Although connectionist models are based on metaphors from the brain, few have been designed to capture temporal and sequential information common to even the most primitive nervous systems.
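A tapped delay line, the kind of explicit spatial representation of time mentioned in the abstract, can be sketched as follows. This is a generic illustration of the delay-line idea, not SYREN's implementation.

```python
from collections import deque

class TappedDelayLine:
    """Explicit spatial representation of time: the last `taps` samples
    are exposed simultaneously as a spatial input vector, so a network
    can see a short history of the signal at once."""
    def __init__(self, taps):
        self.buf = deque([0.0] * taps, maxlen=taps)

    def push(self, sample):
        self.buf.appendleft(sample)   # oldest sample falls off the end
        return list(self.buf)         # newest first

line = TappedDelayLine(taps=3)
line.push(1.0)
line.push(2.0)
snapshot = line.push(3.0)   # [3.0, 2.0, 1.0]: three time steps, side by side
```

Feeding `snapshot` to an ordinary feedforward layer turns a temporal pattern into a spatial one, which is what lets static units respond to formant transitions.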


An Adaptive Network That Learns Sequences of Transitions

Neural Information Processing Systems

We describe an adaptive network, TIN2, that learns the transition function of a sequential system from observations of its behavior.
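Learning a transition function from observed behavior can be sketched as a table-building pass over a trace. The trace format and state names below are hypothetical, not TIN2's representation.

```python
def learn_transitions(observations):
    """Build a transition table (state, input) -> next_state from an
    observed trajectory of a sequential system (hypothetical format)."""
    table = {}
    for state, symbol, next_state in observations:
        table[(state, symbol)] = next_state
    return table

# Hypothetical observed trace of a two-state machine.
trace = [("A", 0, "A"), ("A", 1, "B"), ("B", 0, "A"), ("B", 1, "B")]
delta = learn_transitions(trace)
```

Once every (state, input) pair has been observed, the table reproduces the system's transition function exactly; an adaptive network does the same job with distributed weights instead of an explicit table.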


Fixed Point Analysis for Recurrent Networks

Neural Information Processing Systems

This paper provides a systematic analysis of the recurrent backpropagation (RBP) algorithm, introducing a number of new results. The main limitation of the RBP algorithm is that it assumes the convergence of the network to a stable fixed point in order to backpropagate the error signals. We show by experiment and eigenvalue analysis that this condition can be violated and that chaotic behavior can be avoided. Next we examine the advantages of RBP over the standard backpropagation algorithm. RBP is shown to build stable fixed points corresponding to the input patterns. This makes it an appropriate tool for content addressable memories, one-to-many function learning, and inverse problems.
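RBP's convergence assumption can be illustrated with a simple relaxation loop: the recurrent state is iterated to a fixed point before any error signal could be backpropagated. The tanh dynamics and the weight scale below are assumptions for a generic sketch, not the paper's network.

```python
import numpy as np

def relax_to_fixed_point(W, x, max_iters=500, tol=1e-6):
    """Iterate y <- tanh(W y + x) until the update is below tol.
    RBP assumes this relaxation settles to a stable fixed point;
    with large recurrent gain it can fail to converge."""
    y = np.zeros(W.shape[0])
    for _ in range(max_iters):
        y_new = np.tanh(W @ y + x)
        if np.max(np.abs(y_new - y)) < tol:
            return y_new, True
        y = y_new
    return y, False   # assumption violated: no stable fixed point reached

rng = np.random.default_rng(1)
n = 5
W_small = 0.1 * rng.standard_normal((n, n))   # small gain: a contraction
x = rng.standard_normal(n)
y_star, converged = relax_to_fixed_point(W_small, x)
```

Whether the map is a contraction, and hence whether the loop converges, is governed by the eigenvalues of the linearized dynamics at the fixed point, which is the quantity the paper's eigenvalue analysis examines.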