Weight Space Probability Densities in Stochastic Learning: I. Dynamics and Equilibria
The ensemble dynamics of stochastic learning algorithms can be studied using theoretical techniques from statistical physics. We develop the equations of motion for the weight space probability densities for stochastic learning algorithms. We discuss equilibria in the diffusion approximation and provide expressions for special cases of the LMS algorithm. The equilibrium densities are not in general thermal (Gibbs) distributions in the objective function being minimized, but rather depend upon an effective potential that includes diffusion effects. Finally we present an exact analytical expression for the time evolution of the density for a learning algorithm with weight updates proportional to the sign of the gradient.
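The paper's equations are not reproduced in this abstract; as a hedged illustration (the notation $A$, $B$, and $\mu$ is ours, not the paper's), a one-dimensional diffusion approximation for the weight density $P(w,t)$ takes the Fokker-Planck form
$$
\frac{\partial P(w,t)}{\partial t} = -\frac{\partial}{\partial w}\big[A(w)\,P(w,t)\big] + \frac{\mu}{2}\,\frac{\partial^2}{\partial w^2}\big[B(w)\,P(w,t)\big],
$$
where $A(w)$ is the mean weight update (drift), $B(w)$ its variance (diffusion), and $\mu$ the learning rate. The zero-flux stationary solution is
$$
P_{\mathrm{eq}}(w) \;\propto\; \frac{1}{B(w)}\,\exp\!\left(\frac{2}{\mu}\int^{w}\frac{A(u)}{B(u)}\,du\right) \;\equiv\; e^{-U_{\mathrm{eff}}(w)},
$$
which reduces to a Gibbs distribution in the objective only when $B$ is constant; otherwise the effective potential $U_{\mathrm{eff}}$ is deformed by the diffusion coefficient, consistent with the abstract's claim.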
Attractor Neural Networks with Local Inhibition: from Statistical Physics to a Digital Programmable Integrated Circuit
In particular, the critical capacity of the network is increased, as is its capability to store correlated patterns. Chaotic dynamic behaviour (exponentially long transients) of the devices indicates the overloading of the associative memory. An implementation based on a programmable logic device is presented here. A 16-neuron circuit is implemented with a XILINX 4020 device. The peculiarity of this solution is the possibility to change parts of the project (weights, transfer function or the whole architecture) with a simple software download of the configuration into the XILINX chip. 1 INTRODUCTION Attractor Neural Networks endowed with local inhibitory feedbacks have been shown to have interesting computational performance [1].
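No equations or circuit details appear in the abstract; the following is a minimal Python sketch of attractor dynamics with an added local inhibitory feedback term, assuming a simple Hebbian rule and an inhibition strength gamma (the rule, gamma, and the 16-unit size echoing the XILINX demo are our illustrative assumptions, not the paper's design):

```python
import numpy as np

# Hedged sketch (not the paper's circuit): a binary attractor network whose
# units receive a local inhibitory feedback term in addition to Hebbian input.
rng = np.random.default_rng(0)
N, P = 16, 3                           # 16 units, 3 stored patterns
patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns) / N        # Hebbian weights
np.fill_diagonal(W, 0.0)

def update(s, gamma=0.5, steps=20):
    """Asynchronous dynamics with a local inhibition term -gamma*s_i (assumed form)."""
    for _ in range(steps):
        for i in rng.permutation(N):
            h = W[i] @ s - gamma * s[i]
            s[i] = 1 if h >= 0 else -1
    return s

noisy = patterns[0].copy()
noisy[:3] *= -1                        # corrupt a few bits
print((update(noisy) == patterns[0]).all())  # recall of the stored pattern
```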
Recognition-based Segmentation of On-Line Hand-printed Words
Schenkel, M., Weissman, H., Guyon, I., Nohl, C., Henderson, D.
The input strings consist of a time-ordered sequence of XY coordinates, punctuated by pen-lifts. The methods were designed to work in "run-on mode" where there is no constraint on the spacing between characters. While both methods use a neural network recognition engine and a graph-algorithmic post-processor, their approaches to segmentation are quite different. The first method, which we call INSEC (for input segmentation), uses a combination of heuristics to identify particular pen-lifts as tentative segmentation points. The second method, which we call OUTSEC (for output segmentation), relies on the empirically trained recognition engine for both recognizing characters and identifying relevant segmentation points. 1 INTRODUCTION We address the problem of writer independent recognition of hand-printed words from an 80,000-word English dictionary. Several levels of difficulty in the recognition of hand-printed words are illustrated in figure 1. The examples were extracted from our databases (table 1). Except in the cases of boxed or clearly spaced characters, segmenting characters independently of the recognition process yields poor recognition performance. This has motivated us to explore recognition-based segmentation techniques.
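As a hedged illustration of the INSEC idea of treating certain pen-lifts as tentative segmentation points, the sketch below uses a single assumed heuristic, an x-gap threshold relative to estimated character width; the paper's actual combination of heuristics is not given in this abstract:

```python
# Hedged sketch of an INSEC-style heuristic (details assumed, not the paper's):
# hypothesize a cut at a pen-lift when the pen jumps far enough horizontally.

def tentative_cuts(strokes, char_width=1.0, gap_ratio=0.3):
    """strokes: list of [(x, y), ...] point lists, one per pen-down segment.
    Returns indices i such that a cut is hypothesized between stroke i and i+1."""
    cuts = []
    for i in range(len(strokes) - 1):
        gap = strokes[i + 1][0][0] - strokes[i][-1][0]  # x-jump across the pen-lift
        if gap > gap_ratio * char_width:
            cuts.append(i)
    return cuts

word = [[(0.0, 0), (0.4, 1)], [(0.45, 0), (0.8, 1)], [(1.4, 0), (1.9, 1)]]
print(tentative_cuts(word))  # -> [1]: the large jump before the third stroke
```

A recognition engine would then score the character hypotheses induced by these tentative cuts, with a graph-algorithmic post-processor choosing the best consistent segmentation.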
Metamorphosis Networks: An Alternative to Constructive Models
Bonnlander, Brian V., Mozer, Michael C.
Given a set of training examples, determining the appropriate number of free parameters is a challenging problem. Constructive learning algorithms attempt to solve this problem automatically by adding hidden units, and therefore free parameters, during learning. We explore an alternative class of algorithms, called metamorphosis algorithms, in which the number of units is fixed, but the number of free parameters gradually increases during learning. The architecture we investigate is composed of RBF units on a lattice, which imposes flexible constraints on the parameters of the network. Virtues of this approach include variable subset selection, robust parameter selection, multiresolution processing, and interpolation of sparse training data.
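The abstract does not specify how parameters are gradually freed; the sketch below illustrates one plausible metamorphosis-style schedule, with RBF units fixed on a lattice and output weights, widths, and centers unfrozen in stages (the staging, learning rate, and gradient details are our assumptions, not the paper's method):

```python
import numpy as np

# Hedged sketch: unit count is fixed, but parameters are unfrozen in stages,
# so the number of *free* parameters grows during learning.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0])

centers = np.linspace(-1, 1, 10).reshape(-1, 1)  # RBF units fixed on a lattice
log_width = np.full(10, np.log(0.3))
w = np.zeros(10)

def phi(X):
    d2 = (X - centers.T) ** 2
    return np.exp(-d2 / np.exp(2 * log_width))

stages = [("w",), ("w", "log_width"), ("w", "log_width", "centers")]
lr = 0.05
for stage in stages:                  # each stage frees more parameters
    for _ in range(300):
        H = phi(X)
        err = H @ w - y
        if "w" in stage:
            w -= lr * H.T @ err / len(X)
        if "log_width" in stage:
            g = (err[:, None] * w * H * 2 * (X - centers.T) ** 2
                 / np.exp(2 * log_width)).mean(0)
            log_width -= lr * g
        if "centers" in stage:
            g = (err[:, None] * w * H * 2 * (X - centers.T)
                 / np.exp(2 * log_width)).mean(0)
            centers[:, 0] -= lr * g
print("final MSE:", float((err ** 2).mean()))
```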
Learning Sequential Tasks by Incrementally Adding Higher Orders
An incremental, higher-order, non-recurrent network combines two properties found to be useful for learning sequential tasks: higher-order connections and incremental introduction of new units. The network adds higher orders when needed by adding new units that dynamically modify connection weights. Since the new units modify the weights at the next time-step with information from the previous step, temporal tasks can be learned without the use of feedback, thereby greatly simplifying training. Furthermore, a theoretically unlimited number of units can be added to reach into the arbitrarily distant past.
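As a hedged sketch of the higher-order mechanism described, the code below lets second-order weights modulate the effective first-order weights using the previous input, so temporal context enters without recurrent feedback (the multiplicative form and all names are our illustrative assumptions):

```python
import numpy as np

# Hedged sketch: effective first-order weights at time t are modified by the
# previous input, injecting temporal context without recurrent feedback.
rng = np.random.default_rng(0)
n_in, n_out = 4, 2
W = rng.normal(0, 0.1, (n_out, n_in))         # first-order weights
V = rng.normal(0, 0.1, (n_out, n_in, n_in))   # second-order (added-unit) weights

def step(x_t, x_prev):
    W_eff = W + V @ x_prev                    # weights modulated by previous input
    return np.tanh(W_eff @ x_t)

xs = rng.normal(size=(5, n_in))               # a length-5 input sequence
x_prev = np.zeros(n_in)
for x_t in xs:
    y = step(x_t, x_prev)
    x_prev = x_t
print(y)
```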
Some Solutions to the Missing Feature Problem in Vision
In visual processing the ability to deal with missing and noisy information is crucial. Occlusions and unreliable feature detectors often lead to situations where little or no direct information about features is available. However, the available information is usually sufficient to highly constrain the outputs. We discuss Bayesian techniques for extracting class probabilities given partial data. The optimal solution involves integrating over the missing dimensions weighted by the local probability densities. We show how to obtain closed-form approximations to the Bayesian solution using Gaussian basis function networks.
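The closed form alluded to can be sketched as follows (notation ours): if the input density is modeled by Gaussian basis functions indexed by $i$, and the input splits into observed coordinates $x_o$ and missing coordinates $x_m$, then integrating over the missing dimensions reduces to evaluating each Gaussian's marginal on the observed coordinates:
$$
P(c \mid x_o) = \int P(c \mid x_o, x_m)\, p(x_m \mid x_o)\, dx_m
\;\approx\; \frac{\sum_i P(c \mid i)\, \mathcal{N}\!\big(x_o;\, \mu_i^{(o)}, \Sigma_i^{(oo)}\big)}{\sum_j \mathcal{N}\!\big(x_o;\, \mu_j^{(o)}, \Sigma_j^{(oo)}\big)},
$$
where $\mu_i^{(o)}$ and $\Sigma_i^{(oo)}$ are each basis function's mean and covariance restricted to the observed dimensions, since the marginal of a Gaussian over any subset of coordinates is again Gaussian.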
Hidden Markov Models in Molecular Biology: New Algorithms and Applications
Baldi, Pierre, Chauvin, Yves, Hunkapiller, Tim, McClure, Marcella A.
Hidden Markov Models (HMMs) can be applied to several important problems in molecular biology. We introduce a new convergent learning algorithm for HMMs that, unlike the classical Baum-Welch algorithm, is smooth and can be applied online or in batch mode, with or without the usual Viterbi most likely path approximation. Left-right HMMs with insertion and deletion states are then trained to represent several protein families including immunoglobulins and kinases. In all cases, the models derived capture all the important statistical properties of the families and can be used efficiently in a number of important tasks such as multiple alignment, motif detection, and classification.
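The new algorithm itself is not given in the abstract; as a hedged stand-in, the sketch below shows one generic way a smooth, normalized, online HMM update can work: parameterize the rows of the transition and emission matrices with softmaxes and take gradient steps on the sequence log-likelihood (this substitutes a plain gradient scheme for the paper's algorithm, and the numerical gradient is only for brevity):

```python
import numpy as np

# Hedged sketch (not the paper's exact algorithm): softmax-parameterized HMM
# rows stay normalized by construction, so gradient updates remain smooth and
# can be applied per sequence, i.e. online, without Baum-Welch re-estimation.
rng = np.random.default_rng(0)
S, O = 2, 2                          # hidden states, observation symbols
Wt = rng.normal(0, 0.1, (S, S))      # transition logits
We = rng.normal(0, 0.1, (S, O))      # emission logits

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def log_lik(obs):
    """Scaled forward algorithm under the current logits."""
    A, B = softmax(Wt), softmax(We)
    alpha = np.full(S, 1.0 / S) * B[:, obs[0]]
    ll = np.log(alpha.sum()); alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        c = alpha.sum(); ll += np.log(c); alpha /= c
    return ll

def online_step(obs, lr=0.5, eps=1e-5):
    """One smooth update: ascend the log-likelihood of the current sequence."""
    for param in (Wt, We):
        g = np.zeros_like(param)
        for idx in np.ndindex(param.shape):
            param[idx] += eps; up = log_lik(obs)
            param[idx] -= 2 * eps; dn = log_lik(obs)
            param[idx] += eps
            g[idx] = (up - dn) / (2 * eps)
        param += lr * g

obs = np.array([0, 0, 1, 1, 0, 0, 1, 1])
print("before:", log_lik(obs))
for _ in range(50):
    online_step(obs)
print("after: ", log_lik(obs))       # log-likelihood increases smoothly
```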