A Reinforcement Learning Algorithm in Partially Observable Environments Using Short-Term Memory
Suematsu, Nobuo, Hayashi, Akira
Since BLHT learns a stochastic model based on Bayesian Learning, the overfitting problem is reasonably solved. Moreover, BLHT has an efficient implementation. This paper shows that the model learned by BLHT converges to one which provides the most accurate predictions of percepts and rewards, given short-term memory. 1 INTRODUCTION Research on the Reinforcement Learning (RL) problem for partially observable environments has been gaining more attention recently. This is mainly because the assumption that a perfect and complete perception of the state of the environment is available to the learning agent, which many previous RL algorithms require, is not valid for many realistic environments.
Visualizing Group Structure
Held, Marcus, Puzicha, Jan, Buhmann, Joachim M.
Cluster analysis is a fundamental principle in exploratory data analysis, providing the user with a description of the group structure of given data. A key problem in this context is the interpretation and visualization of clustering solutions in high-dimensional or abstract data spaces. In particular, probabilistic descriptions of the group structure, essential to capture inter-cluster relationships, are hardly assessable by simple inspection of the probabilistic assignment variables. We present a novel approach to the visualization of group structure. It is based on a statistical model of the object assignments which have been observed or estimated by a probabilistic clustering procedure. The objects or data points are embedded in a low-dimensional Euclidean space by approximating the observed data statistics with a Gaussian mixture model. The algorithm provides a new approach to the visualization of the inherent structure for a broad variety of data types, e.g.
Bayesian PCA
The technique of principal component analysis (PCA) has recently been expressed as the maximum likelihood solution for a generative latent variable model. In this paper we use this probabilistic reformulation as the basis for a Bayesian treatment of PCA. Our key result is that the effective dimensionality of the latent space (equivalent to the number of retained principal components) can be determined automatically as part of the Bayesian inference procedure. An important application of this framework is to mixtures of probabilistic PCA models, in which each component can determine its own effective complexity. 1 Introduction Principal component analysis (PCA) is a widely used technique for data analysis. Recently Tipping and Bishop (1997b) showed that a specific form of generative latent variable model has the property that its maximum likelihood solution extracts the principal subspace of the observed data set.
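The Tipping and Bishop maximum likelihood solution mentioned above has a closed form via the eigendecomposition of the sample covariance; the Bayesian treatment of the paper builds on it. A minimal sketch of the ML solution only (the variable names and the synthetic check are illustrative, not from the paper):

```python
import numpy as np

def ppca_ml(X, q):
    """Closed-form maximum-likelihood probabilistic PCA:
    W = U_q (L_q - sigma2 I)^(1/2), with sigma2 the mean of the
    d - q discarded eigenvalues of the sample covariance."""
    Xc = X - X.mean(axis=0)
    S = Xc.T @ Xc / len(X)                    # sample covariance, d x d
    evals, evecs = np.linalg.eigh(S)          # eigenvalues in ascending order
    evals, evecs = evals[::-1], evecs[:, ::-1]
    sigma2 = evals[q:].mean()                 # noise = avg. discarded variance
    W = evecs[:, :q] * np.sqrt(np.maximum(evals[:q] - sigma2, 0.0))
    return W, sigma2

# Synthetic check: 5-D data with a 2-D latent subspace plus noise (sd 0.1)
rng = np.random.default_rng(1)
Z = rng.normal(size=(1000, 2))
A = rng.normal(size=(2, 5))
X = Z @ A + 0.1 * rng.normal(size=(1000, 5))
W, s2 = ppca_ml(X, 2)   # s2 should come out near 0.1**2
```

In the Bayesian extension, hierarchical priors over the columns of W play the role that the manual choice of q plays here.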
Perceiving without Learning: From Spirals to Inside/Outside Relations
As a benchmark task, the spiral problem is well known in neural networks. Unlike previous work that emphasizes learning, we approach the problem from a generic perspective that does not involve learning. We point out that the spiral problem is intrinsically connected to the inside/outside problem. A generic solution to both problems is proposed based on oscillatory correlation using a time delay network. Our simulation results are qualitatively consistent with human performance, and we interpret human limitations in terms of synchrony and time delays, both biologically plausible. As a special case, our network without time delays can always distinguish these figures regardless of shape, position, size, and orientation.
Learning Multi-Class Dynamics
Blake, Andrew, North, Ben, Isard, Michael
Yule-Walker) are available for learning Auto-Regressive process models of simple, directly observable, dynamical processes. When sensor noise means that dynamics are observed only approximately, learning can still be achieved via Expectation-Maximisation (EM) together with Kalman Filtering. However, this does not handle more complex dynamics, involving multiple classes of motion.
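For the directly observable case the abstract starts from, the Yule-Walker equations recover AR coefficients by solving a linear system in the sample autocovariances. A minimal sketch of this classical method (not the paper's EM-based or multi-class extensions):

```python
import numpy as np

def yule_walker(x, order):
    """Fit AR coefficients a_1..a_p of x[t] = sum_k a_k x[t-k] + noise
    by solving the Yule-Walker equations in the sample autocovariances."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    # Sample autocovariances r[0..order]
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    # Toeplitz system R a = (r_1, ..., r_p)
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, r[1:])

# Recover the coefficients of a known, stable AR(2) process
rng = np.random.default_rng(0)
a_true = np.array([0.6, -0.2])
x = np.zeros(20000)
for t in range(2, len(x)):
    x[t] = a_true[0] * x[t - 1] + a_true[1] * x[t - 2] + rng.normal()
a_hat = yule_walker(x, 2)   # close to a_true for long series
```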
Convergence of the Wake-Sleep Algorithm
Ikeda, Shiro, Amari, Shun-ichi, Nakahara, Hiroyuki
The WS (Wake-Sleep) algorithm is a simple learning rule for models with hidden variables. It is shown that this algorithm can be applied to a factor analysis model, which is a linear version of the Helmholtz machine. But even for a factor analysis model, the general convergence is not proved theoretically.
Computational Differences between Asymmetrical and Symmetrical Networks
However, because of the separation between excitation and inhibition, biological neural networks are asymmetrical. We study characteristic differences between asymmetrical networks and their symmetrical counterparts, showing that they have dramatically different dynamical behavior and also how the differences can be exploited for computational ends. We illustrate our results in the case of a network that is a selective amplifier.
Exploiting Generative Models in Discriminative Classifiers
Jaakkola, Tommi, Haussler, David
On the other hand, discriminative methods such as support vector machines enable us to construct flexible decision boundaries and often result in classification performance superior to that of the model based approaches. An ideal classifier should combine these two complementary approaches. In this paper, we develop a natural way of achieving this combination by deriving kernel functions for use in discriminative methods such as support vector machines from generative probability models.
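The combination described here is the Fisher kernel: each example is mapped to the gradient of the generative model's log-likelihood (the Fisher score), and examples are compared under the Fisher information metric. A minimal sketch for a single univariate Gaussian generative model (the choice of model and its parameterization are illustrative, not from the paper):

```python
import numpy as np

def fisher_score(x, mu, sigma2):
    """Gradient of log N(x | mu, sigma2) w.r.t. the parameters (mu, sigma2)."""
    d_mu = (x - mu) / sigma2
    d_s2 = 0.5 * ((x - mu) ** 2 / sigma2 ** 2 - 1.0 / sigma2)
    return np.array([d_mu, d_s2])

def fisher_kernel(x, y, mu, sigma2):
    """K(x, y) = U_x^T F^{-1} U_y, where F is the Fisher information of
    the Gaussian: F = diag(1/sigma2, 1/(2 sigma2^2))."""
    F_inv = np.diag([sigma2, 2.0 * sigma2 ** 2])
    return fisher_score(x, mu, sigma2) @ F_inv @ fisher_score(y, mu, sigma2)
```

Such a kernel can then be dropped into any kernel classifier, e.g. a support vector machine, so the decision boundary is discriminative while the features come from the fitted generative model.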