
Network Model of State-Dependent Sequencing

Neural Information Processing Systems

A network model with temporal sequencing and state-dependent modulatory features is described. The model is motivated by neurocognitive data characterizing different states of waking and sleeping. Computer studies demonstrate how distinct states of sequencing can exist within the same network under different aminergic and cholinergic modulatory influences. Relationships between state-dependent modulation, memory, sequencing, and learning are discussed.
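As a loose illustration of how a single set of weights can produce different sequencing regimes under different modulation levels, the following sketch blends an autoassociative term with a heteroassociative sequencing term under a scalar modulation parameter. The dynamics and parameter names are invented for illustration and are not the authors' model.

```python
import numpy as np

# Illustrative sketch (not the authors' model): a small recurrent network that
# steps through stored patterns, with a scalar "modulation" parameter standing
# in for aminergic/cholinergic influence on the transition dynamics.

rng = np.random.default_rng(0)
N, P = 64, 5
patterns = rng.choice([-1.0, 1.0], size=(P, N))            # stored sequence elements

W_auto = patterns.T @ patterns / N                          # autoassociative (stabilizing) weights
W_seq = patterns[np.r_[1:P, 0]].T @ patterns / N            # heteroassociative (sequencing) weights

def run(modulation, steps=30):
    """Higher modulation weights the sequencing term more strongly, so the same
    network either holds a pattern or steps through the stored sequence."""
    x = patterns[0].copy()
    states = []
    for _ in range(steps):
        x = np.sign((1 - modulation) * W_auto @ x + modulation * W_seq @ x)
        states.append(int(np.argmax(patterns @ x / N)))     # index of nearest stored pattern
    return states

print("low modulation :", run(0.1)[:10])   # tends to hold the initial pattern
print("high modulation:", run(0.9)[:10])   # tends to advance through the sequence
```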




Principles of Risk Minimization for Learning Theory

Neural Information Processing Systems

Learning is posed as a problem of function estimation, for which two principles of solution are considered: empirical risk minimization and structural risk minimization. These two principles are applied to two different statements of the function estimation problem: global and local. Systematic improvements in prediction power are illustrated in an application to zip-code recognition.
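For reference, the two risks being compared can be written in the usual way; the notation below is generic rather than taken from the paper.

```latex
% Generic notation (not taken from the paper): expected risk under the unknown
% distribution P(x, y), and its empirical estimate on a training sample of size \ell.
R(f) = \int L\bigl(y, f(x)\bigr)\, dP(x, y),
\qquad
R_{\mathrm{emp}}(f) = \frac{1}{\ell} \sum_{i=1}^{\ell} L\bigl(y_i, f(x_i)\bigr)
```

Empirical risk minimization selects the function minimizing R_emp over a fixed function class; structural risk minimization minimizes over a nested sequence of classes of increasing capacity, trading empirical risk against model complexity.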


Unsupervised Classifiers, Mutual Information and 'Phantom Targets'

Neural Information Processing Systems

We derive criteria for training adaptive classifier networks to perform unsupervised data analysis. The first criterion turns a simple Gaussian classifier into a simple Gaussian mixture analyser. The second criterion, which is much more generally applicable, is based on mutual information.
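One common way to write a mutual-information criterion for a softmax classifier over a batch of unlabelled data is sketched below; the exact criterion used in the paper may differ.

```python
import numpy as np

def mutual_information_objective(probs):
    """Mutual information between inputs and predicted class labels, estimated
    from a batch of softmax outputs `probs` (shape: [batch, classes]).
    I = H(mean output) - mean H(output): high when individual predictions are
    confident but the classes are used evenly across the batch.
    A generic formulation, not necessarily the exact criterion of the paper."""
    eps = 1e-12
    p_bar = probs.mean(axis=0)                                             # marginal class usage
    h_marginal = -np.sum(p_bar * np.log(p_bar + eps))                      # H(mean output)
    h_conditional = -np.mean(np.sum(probs * np.log(probs + eps), axis=1))  # mean H(output)
    return h_marginal - h_conditional

# Confident, evenly used classes give a high score; uniform outputs give ~0.
confident = np.array([[0.95, 0.05], [0.05, 0.95], [0.9, 0.1], [0.1, 0.9]])
uniform = np.full((4, 2), 0.5)
print(mutual_information_objective(confident))   # clearly positive
print(mutual_information_objective(uniform))     # close to 0
```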


A Segment-Based Automatic Language Identification System

Neural Information Processing Systems

Automatic language identification is the rapid automatic determination of the language being spoken, by any speaker, saying anything. Despite several important applications of automatic language identification, this area has suffered from a lack of basic research and the absence of a standardized, public-domain database of languages. It is well known that languages have characteristic sound patterns. Languages have been described subjectively as "singsong", "rhythmic", "guttural", "nasal", etc. The key to solving the problem of automatic language identification is the detection and exploitation of such differences between languages. We assume that each language in the world has a unique acoustic structure, and that this structure can be defined in terms of phonetic and prosodic features of speech.


Some Approximation Properties of Projection Pursuit Learning Networks

Neural Information Processing Systems

This paper addresses an important question in machine learning: what kind of network architectures work better on what kind of problems? A projection pursuit learning network has a structure very similar to that of a one-hidden-layer sigmoidal neural network. A general method based on a continuous version of projection pursuit regression is developed to show that projection pursuit regression works better on angular smooth functions than on Laplacian smooth functions. There exists a ridge function approximation scheme that avoids the curse of dimensionality for approximating functions in L2 over a d-dimensional domain.

1 INTRODUCTION
Projection pursuit is a nonparametric statistical technique for finding "interesting" low-dimensional projections of high-dimensional data sets. It has been used for nonparametric fitting and other data-analytic purposes (Friedman and Stuetzle, 1981; Huber, 1985).
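For context, the structural similarity mentioned above can be seen by writing both models in their usual ridge-sum form; the notation here is generic and not taken from the paper.

```latex
% Projection pursuit regression: a sum of learned ridge functions g_j along learned directions a_j.
f_{\mathrm{PPR}}(x) = \sum_{j=1}^{m} g_j\!\left(a_j^{\top} x\right)
% One-hidden-layer sigmoidal network: the ridge functions are scaled, shifted copies of a fixed sigmoid.
f_{\mathrm{NN}}(x) = \sum_{j=1}^{m} w_j\, \sigma\!\left(a_j^{\top} x + b_j\right)
```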


Information Measure Based Skeletonisation

Neural Information Processing Systems

Automatic determination of proper neural network topology by trimming oversized networks is an important area of study, which has previously been addressed using a variety of techniques. In this paper, we present Information Measure Based Skeletonisation (IMBS), a new approach to this problem in which superfluous hidden units are removed based on their information measure (IM). This measure, borrowed from decision tree induction techniques, reflects the degree to which the hyperplane formed by a hidden unit discriminates between training data classes. We show the results of applying IMBS to three classification tasks and demonstrate that it removes a substantial number of hidden units without significantly affecting network performance.
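The flavour of such an information measure can be illustrated as the information gain of the two-way split a hidden unit induces on the training classes. The sketch below illustrates that decision-tree-style idea; it is not the exact IMBS formula from the paper.

```python
import numpy as np

def information_measure(activations, labels, threshold=0.5):
    """Information gain of the two-way split that a hidden unit's hyperplane
    induces on the training data (which side each example falls on).
    A sketch of the idea borrowed from decision tree induction, not the
    exact IMBS formula."""
    def entropy(y):
        if y.size == 0:
            return 0.0
        _, counts = np.unique(y, return_counts=True)
        p = counts / counts.sum()
        return float(-np.sum(p * np.log2(p)))

    side = activations >= threshold                     # which side of the hyperplane
    h_parent = entropy(labels)
    h_children = (side.mean() * entropy(labels[side])
                  + (~side).mean() * entropy(labels[~side]))
    return h_parent - h_children                        # low values suggest a superfluous unit

# A unit whose split separates the classes carries close to 1 bit of information.
acts = np.array([0.9, 0.8, 0.1, 0.2, 0.7, 0.3])
labels = np.array([1, 1, 0, 0, 1, 0])
print(information_measure(acts, labels))
```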


ANN Based Classification for Heart Defibrillators

Neural Information Processing Systems

These devices are implanted and perform three types of actions: (1) monitor the heart, (2) pace the heart, and (3) apply a high-energy/high-voltage electric shock. They sense the electrical activity of the heart through leads attached to the heart tissue. Two types of sensing are commonly used: Single Chamber, with a lead attached to the Right Ventricular Apex (RVA), and Dual Chamber, with an additional lead attached to the High Right Atrium (HRA). The actions performed by defibrillators are based on the outcome of a classification procedure applied to the heart rhythms of different heart diseases (abnormal rhythms or "arrhythmias").
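As a purely illustrative sketch of the classification step, a small feed-forward network could map rhythm features derived from the sensing leads to a rhythm class. The feature layout, class names, and weights below are invented for illustration and are not taken from the paper; a real model would be trained on labeled arrhythmia data.

```python
import numpy as np

# Illustrative only: tiny feed-forward classifier over rhythm features
# (e.g., mean/std of beat intervals per lead) with untrained random weights.
rng = np.random.default_rng(0)
classes = ["normal sinus rhythm", "ventricular tachycardia", "ventricular fibrillation"]

n_features, n_hidden = 4, 8
W1, b1 = rng.normal(size=(n_hidden, n_features)), np.zeros(n_hidden)
W2, b2 = rng.normal(size=(len(classes), n_hidden)), np.zeros(len(classes))

def classify(features):
    h = np.tanh(W1 @ features + b1)                 # hidden layer
    logits = W2 @ h + b2
    p = np.exp(logits - logits.max())
    p /= p.sum()                                    # softmax over rhythm classes
    return classes[int(np.argmax(p))], p

print(classify(np.array([0.8, 0.05, 0.82, 0.04])))  # untrained weights: output is arbitrary
```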


Data Analysis using G/SPLINES

Neural Information Processing Systems

G/SPLINES is an algorithm for building functional models of data. It uses genetic search to discover combinations of basis functions, which are then used to build a least-squares regression model. Because it produces a population of models which evolve over time rather than a single model, it allows analysis not possible with other regression-based approaches.

1 INTRODUCTION
G/SPLINES is a hybrid of Friedman's Multivariate Adaptive Regression Splines (MARS) algorithm (Friedman, 1990) with Holland's Genetic Algorithm (Holland, 1975). G/SPLINES has advantages over MARS in that it requires fewer least-squares computations, is easily extendable to non-spline basis functions, may discover models inaccessible to local-variable selection algorithms, and allows significantly larger problems to be considered. These issues are discussed in (Rogers, 1991). This paper begins with a discussion of linear regression models, followed by a description of the G/SPLINES algorithm, and finishes with a series of experiments illustrating its performance, robustness, and analysis capabilities.
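A schematic sketch of the genetic-search-plus-least-squares loop described above is given below, using MARS-style truncated-linear basis functions on a toy problem. It illustrates the idea only and is not Rogers' implementation; all names and parameters are invented.

```python
import numpy as np

# Schematic G/SPLINES-style loop: evolve a population of models (each a set of
# basis functions), fit each model's coefficients by least squares, then select
# and recombine the best-scoring models.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 3))
y = np.maximum(0, X[:, 0] - 0.2) * X[:, 1] + 0.1 * rng.normal(size=200)   # toy target

def random_basis():
    """One truncated-linear (MARS-style) basis function on a random variable."""
    var, knot, sign = rng.integers(3), rng.uniform(-1, 1), rng.choice([-1.0, 1.0])
    return (var, knot, sign)

def design_matrix(model, X):
    cols = [np.ones(len(X))] + [np.maximum(0.0, s * (X[:, v] - k)) for v, k, s in model]
    return np.column_stack(cols)

def fitness(model):
    B = design_matrix(model, X)
    coef, *_ = np.linalg.lstsq(B, y, rcond=None)      # least-squares fit of coefficients
    return float(np.mean((B @ coef - y) ** 2))        # lower MSE is better

population = [[random_basis() for _ in range(5)] for _ in range(30)]
for generation in range(40):
    population.sort(key=fitness)
    parents = population[:10]
    children = []
    for _ in range(20):
        a, b = rng.choice(len(parents), 2, replace=False)
        child = ([f for f in parents[a] if rng.random() < 0.5]
                 + [f for f in parents[b] if rng.random() < 0.5])   # crossover of basis sets
        if not child or rng.random() < 0.3:
            child.append(random_basis())              # mutation: introduce a new basis function
        children.append(child)
    population = parents + children

print("best model MSE:", fitness(population[0]))
```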