Transient Signal Detection with Neural Networks: The Search for the Desired Signal
Príncipe, José Carlos, Zahalka, Abir
Matched filtering has been one of the most powerful techniques employed for transient detection. Here we show that a dynamic neural network outperforms the conventional approach. When the artificial neural network (ANN) is trained with supervised learning schemes, the desired signal must be supplied for all time, although we are only interested in detecting the transient. In this paper we also show how different strategies for constructing the desired signal affect detection performance. The extension of the Bayes decision rule (0/1 desired signal), optimal in static classification, performs worse than desired signals constructed from random noise or from prediction during the background.

1 INTRODUCTION

Detection of poorly defined waveshapes in a nonstationary high-noise background is an important and difficult problem in signal processing.
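To make the conventional baseline concrete, matched filtering can be sketched in a few lines of NumPy. This is a minimal sketch; the template shape, noise level, and transient position are invented for illustration and are not taken from the paper:

```python
import numpy as np

# Illustrative matched-filter transient detector; the windowed-sinusoid
# template, noise level, and onset position are made up for this sketch.
rng = np.random.default_rng(0)
t = np.arange(32)
template = np.hanning(32) * np.sin(2 * np.pi * 4 * t / 32)

signal = rng.normal(0.0, 0.3, 256)     # toy stationary noise background
signal[100:132] += template            # embed the transient at sample 100

# Matched filtering == correlating the signal with the known template.
mf_output = np.correlate(signal, template, mode="valid")
detected_at = int(np.argmax(mf_output))
print(detected_at)                     # should peak near the true onset
```

The network approach studied in the paper replaces this fixed template correlation with a trained dynamic model.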
A Method for Learning From Hints
We address the problem of learning an unknown function by putting together several pieces of information (hints) that we know about the function. We introduce a method that generalizes learning from examples to learning from hints. A canonical representation of hints is defined and illustrated for new types of hints. All the hints are represented to the learning process by examples, and examples of the function are treated on an equal footing with the rest of the hints. During learning, examples from different hints are selected for processing according to a given schedule. We present two types of schedules: fixed schedules that specify the relative emphasis of each hint, and adaptive schedules that are based on how well each hint has been learned so far. Our learning method is compatible with any descent technique that we may choose to use.
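The two schedule types can be caricatured in a few lines. This is a toy sketch with invented names and error values; the paper's schedules operate on the actual per-hint errors measured during descent:

```python
import numpy as np

# Toy contrast between the two schedule families: a fixed schedule samples
# hints with constant probabilities, while an adaptive schedule favours the
# hint that is currently worst learned. Values are illustrative only.
rng = np.random.default_rng(4)
hint_errors = np.array([0.5, 0.1, 0.3])   # invented current error per hint

def fixed_schedule(weights):
    p = np.asarray(weights, dtype=float)
    return int(rng.choice(len(p), p=p / p.sum()))

def adaptive_schedule(errors):
    return int(np.argmax(errors))          # pick the worst-learned hint

print(adaptive_schedule(hint_errors))      # -> 0 (largest error)
```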
Learning Curves, Model Selection and Complexity of Neural Networks
Murata, Noboru, Yoshizawa, Shuji, Amari, Shun-ichi
Learning curves show how a neural network improves as the number of training examples increases, and how this improvement relates to the network's complexity. The present paper clarifies the asymptotic properties of, and the relation between, two learning curves: one concerning the predictive (generalization) loss and the other the training loss. The result gives a natural definition of the complexity of a neural network. Moreover, it provides a new criterion for model selection.
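The flavour of such a criterion can be illustrated with an AIC-style penalty that trades training loss against parameter count. This is a hedged sketch: the paper's criterion is more general, and the data and polynomial model family below are invented for illustration:

```python
import numpy as np

# AIC-style model selection on invented regression data: penalize the
# training loss by the number of free parameters.
rng = np.random.default_rng(1)
n = 200
x = np.linspace(-1, 1, n)
y = np.sin(np.pi * x) + rng.normal(0.0, 0.2, n)

best_deg, best_score = None, np.inf
for deg in range(1, 10):
    coef = np.polyfit(x, y, deg)
    train_loss = np.mean((y - np.polyval(coef, x)) ** 2)
    k = deg + 1                              # free parameters in the model
    score = n * np.log(train_loss) + 2 * k   # AIC-like criterion
    if score < best_score:
        best_deg, best_score = deg, score
print(best_deg)   # a moderate degree wins: lower degrees underfit,
                  # higher degrees pay the complexity penalty
```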
Integration of Visual and Somatosensory Information for Preshaping Hand in Grasping Movements
Uno, Yoji, Fukumura, Naohiro, Suzuki, Ryoji, Kawato, Mitsuo
The primate brain must solve two important problems in grasping movements. The first problem concerns the recognition of grasped objects: specifically, how does the brain integrate visual and motor information on a grasped object? The second problem concerns hand shape planning: specifically, how does the brain design the hand configuration suited to the shape of the object and the manipulation task? A neural network model that solves these problems has been developed.
A Formal Model of the Insect Olfactory Macroglomerulus: Simulations and Analytic Results
Linster, Christiane, Marsan, David, Masson, Claudine, Kerszberg, Michel, Dreyfus, Gérard, Personnaz, Léon
It is known from biological data that the response patterns of interneurons in the olfactory macroglomerulus (MGC) of insects are of central importance for the coding of the olfactory signal. We propose an analytically tractable model of the MGC which allows us to relate the distribution of response patterns to the architecture of the network.
A Boundary Hunting Radial Basis Function Classifier which Allocates Centers Constructively
Chang, Eric I., Lippmann, Richard P.
A new boundary hunting radial basis function (BH-RBF) classifier which allocates RBF centers constructively near class boundaries is described. This classifier creates complex decision boundaries only in regions where confusions occur and corresponding RBF outputs are similar. A predicted square error measure is used to determine how many centers to add and to determine when to stop adding centers. Two experiments are presented which demonstrate the advantages of the BH-RBF classifier. One uses artificial data with two classes and two input features where each class contains four clusters but only one cluster is near a decision region boundary.
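The constructive idea can be caricatured as greedily placing each new center at the training point with the largest current squared error, which tends to concentrate centers where classes are confused. This is a simplified sketch, not the exact BH-RBF allocation or stopping rule; the data, widths, and budget are invented:

```python
import numpy as np

# Greedy constructive RBF sketch on an invented XOR-like two-class problem.
rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, (200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float)

def design(X, centers, width=0.3):
    # Gaussian RBF design matrix: one column per center.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

centers = X[:1].copy()
for _ in range(20):
    Phi = design(X, centers)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    err = (y - Phi @ w) ** 2
    centers = np.vstack([centers, X[np.argmax(err)]])  # add a center there

Phi = design(X, centers)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
acc = float(np.mean(((Phi @ w) > 0.5) == (y > 0.5)))
print(acc)   # training accuracy improves as centers accumulate
```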
Physiologically Based Speech Synthesis
Hirayama, Makoto, Vatikiotis-Bateson, Eric, Honda, Kiyoshi, Koike, Yasuharu, Kawato, Mitsuo
This study demonstrates a paradigm for modeling speech production based on neural networks. Using physiological data from speech utterances, a neural network learns the forward dynamics relating motor commands to muscles and the ensuing articulator behavior; this allows articulator trajectories to be generated from motor commands constrained by phoneme input strings and global performance parameters. From these movement trajectories, a second neural network generates PARCOR parameters that are then used to synthesize the speech acoustics.
Weight Space Probability Densities in Stochastic Learning: I. Dynamics and Equilibria
The ensemble dynamics of stochastic learning algorithms can be studied using theoretical techniques from statistical physics. We develop the equations of motion for the weight space probability densities for stochastic learning algorithms. We discuss equilibria in the diffusion approximation and provide expressions for special cases of the LMS algorithm. The equilibrium densities are not in general thermal (Gibbs) distributions in the objective function being minimized, but rather depend upon an effective potential that includes diffusion effects. Finally we present an exact analytical expression for the time evolution of the density for a learning algorithm with weight updates proportional to the sign of the gradient.
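The diffusion picture can be illustrated by simulating an ensemble of scalar LMS learners and observing that the equilibrium spread of the weight density grows with the step size. This is a minimal sketch with invented data statistics; the paper treats the general density dynamics analytically:

```python
import numpy as np

# Ensemble of scalar LMS learners on data d = w_true * x + noise.
# The stationary ensemble spread widens with the step size mu, consistent
# with the diffusion-broadened equilibrium densities discussed above.
rng = np.random.default_rng(3)
w_true, n_ensemble, n_steps = 1.0, 2000, 1000

def equilibrium_spread(mu):
    w = np.zeros(n_ensemble)
    for _ in range(n_steps):
        x = rng.normal(0.0, 1.0, n_ensemble)
        d = w_true * x + rng.normal(0.0, 0.1, n_ensemble)
        w += mu * (d - w * x) * x          # LMS update per ensemble member
    return float(w.std())

s_small, s_large = equilibrium_spread(0.01), equilibrium_spread(0.1)
print(s_small < s_large)                   # -> True: bigger step, wider density
```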
Attractor Neural Networks with Local Inhibition: from Statistical Physics to a Digital Programmable Integrated Circuit
In particular, the critical capacity of the network is increased, as well as its capability to store correlated patterns. Chaotic dynamic behaviour (exponentially long transients) of the devices indicates the overloading of the associative memory. An implementation based on a programmable logic device is presented here. A 16-neuron circuit is implemented with a XILINX 4020 device. The peculiarity of this solution is the possibility of changing parts of the design (weights, transfer function, or the whole architecture) with a simple software download of the configuration into the XILINX chip.

1 INTRODUCTION

Attractor Neural Networks endowed with local inhibitory feedback have been shown to have interesting computational performance [1].
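For concreteness, attractor recall with 16 binary neurons can be sketched with standard Hebbian (Hopfield) dynamics. This is not the paper's locally-inhibited model; storing a single pattern keeps the toy deterministic:

```python
import numpy as np

# Minimal attractor-memory recall with 16 binary neurons, matching the
# circuit size above but using plain Hebbian dynamics, NOT the paper's
# local-inhibition architecture.
rng = np.random.default_rng(5)
n = 16
pattern = rng.choice([-1, 1], n)
W = np.outer(pattern, pattern) / n         # Hebbian weights
np.fill_diagonal(W, 0.0)                   # no self-coupling

state = pattern.copy()
state[:3] *= -1                            # corrupt three bits
for _ in range(5):                         # iterate to a fixed point
    state = np.where(W @ state >= 0, 1, -1)
recalled = bool(np.array_equal(state, pattern))
print(recalled)                            # -> True: the pattern is restored
```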