A Probabilistic Calculus of Actions

AAAI Conferences

In planning, however, they are less popular,[1] partly due to the unsettled, strange relationship between probability and actions. In principle, actions are not part of standard probability theory, and understandably so: probabilities capture normal relationships in the world, while actions represent interventions that perturb those relationships. It is no wonder, then, that actions are treated as foreign entities throughout the literature on probability and statistics; they serve neither as arguments of probability expressions nor as events for conditioning such expressions. Even in the decision-theoretic literature, where actions are the target of optimization...

[1] Works by Dean & Kanazawa [1989] and Kushmerick et al. [1993] notwithstanding.
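The distinction the excerpt draws, conditioning on an observation versus acting to set a variable, can be made concrete in a toy model. The following sketch (illustrative numbers, not from the paper) enumerates a three-node binary network in which observing X=1 and doing X=1 give different probabilities for Y, because the intervention severs the influence of the common cause Z on X:

# A minimal sketch contrasting observational conditioning with
# intervention in a toy binary network Z -> X, Z -> Y, X -> Y.
# All names and numbers are illustrative assumptions.
from itertools import product

p_z = {0: 0.5, 1: 0.5}                       # P(Z)
p_x_given_z = {0: 0.2, 1: 0.9}               # P(X=1 | Z)
p_y_given_xz = {(0, 0): 0.1, (0, 1): 0.4,    # P(Y=1 | X, Z)
                (1, 0): 0.6, (1, 1): 0.8}

def joint(x, y, z):
    px = p_x_given_z[z] if x == 1 else 1 - p_x_given_z[z]
    py = p_y_given_xz[(x, z)] if y == 1 else 1 - p_y_given_xz[(x, z)]
    return p_z[z] * px * py

# Observational: P(Y=1 | X=1). Conditioning lets X carry information about Z.
num = sum(joint(1, 1, z) for z in (0, 1))
den = sum(joint(1, y, z) for y, z in product((0, 1), repeat=2))
print("P(Y=1 | X=1)     =", num / den)

# Interventional: P(Y=1 | do(X=1)). The action severs Z -> X,
# so we average over P(Z) rather than P(Z | X=1).
print("P(Y=1 | do(X=1)) =", sum(p_z[z] * p_y_given_xz[(1, z)] for z in (0, 1)))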


Execution Monitoring with Quantitative Temporal Bayesian Networks

AAAI Conferences

The goal of execution monitoring is to determine whether a system or person is following a plan appropriately. Monitoring information may be uncertain, and the plan being monitored may have complex temporal constraints. We develop a new framework for reasoning under uncertainty with quantitative temporal constraints, Quantitative Temporal Bayesian Networks (QTBNs), and we discuss its application to plan-execution monitoring. QTBNs extend the major previous approaches to temporal reasoning under uncertainty: Time Nets (Kanazawa 1991), Dynamic Bayesian Networks (DBNs), and Dynamic Object Oriented Bayesian Networks (DOOBNs; Friedman, Koller, & Pfeffer 1998). We argue that Time Nets can model quantitative temporal relationships but cannot easily model the changing values of fluents, while DBNs and DOOBNs naturally model fluents, but not quantitative temporal relationships. Both capabilities are required for execution monitoring, and both are supported by QTBNs.
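To make the contrast concrete, here is a minimal sketch (all names and probabilities are illustrative assumptions, not the QTBN construction) of how a DBN tracks a fluent through fixed-length time slices, which is exactly why a quantitative constraint such as "B occurs 3 to 5 minutes after A" needs the extra machinery QTBNs add:

# A two-slice DBN over one boolean fluent: the transition model advances
# belief by one fixed-length tick per slice.  Illustrative numbers only.
import numpy as np

# P(fluent_t | fluent_{t-1}): rows = previous value, cols = next value.
transition = np.array([[0.9, 0.1],    # off -> {off, on}
                       [0.3, 0.7]])   # on  -> {off, on}

belief = np.array([1.0, 0.0])         # start: fluent is off
for t in range(1, 4):
    belief = belief @ transition      # one fixed tick per slice
    print(f"t={t}  P(fluent on) = {belief[1]:.3f}")

# Because each slice is one fixed tick, expressing "3 to 5 minutes after A"
# directly is awkward -- the gap the abstract says QTBNs are designed to fill.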


The Author-Topic Model for Authors and Documents

arXiv.org Machine Learning

We introduce the author-topic model, a generative model for documents that extends Latent Dirichlet Allocation (LDA; Blei, Ng, & Jordan, 2003) to include authorship information. Each author is associated with a multinomial distribution over topics and each topic is associated with a multinomial distribution over words. A document with multiple authors is modeled as a distribution over topics that is a mixture of the distributions associated with the authors. We apply the model to a collection of 1,700 NIPS conference papers and 160,000 CiteSeer abstracts. Exact inference is intractable for these datasets and we use Gibbs sampling to estimate the topic and author distributions. We compare the performance with two other generative models for documents, which are special cases of the author-topic model: LDA (a topic model) and a simple author model in which each author is associated with a distribution over words rather than a distribution over topics. We show topics recovered by the author-topic model, and demonstrate applications to computing similarity between authors and entropy of author output.
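The generative process described above is short enough to sketch directly. The following toy implementation (sizes, vocabulary, and Dirichlet parameters are illustrative assumptions) samples a document by repeatedly choosing one of its authors, then a topic from that author's topic distribution, then a word from that topic:

# A minimal sketch of the author-topic generative process.
import numpy as np

rng = np.random.default_rng(0)
n_authors, n_topics, vocab_size, doc_len = 3, 4, 50, 20

theta = rng.dirichlet(np.full(n_topics, 0.5), size=n_authors)  # author -> topics
phi = rng.dirichlet(np.full(vocab_size, 0.1), size=n_topics)   # topic -> words

def generate_document(author_ids, length=doc_len):
    words = []
    for _ in range(length):
        a = rng.choice(author_ids)             # attribute the word to an author
        z = rng.choice(n_topics, p=theta[a])   # topic from that author's mixture
        w = rng.choice(vocab_size, p=phi[z])   # word from the topic
        words.append(w)
    return words

print(generate_document(author_ids=[0, 2]))    # a two-author document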


Application of Variational Bayesian Approach to Speech Recognition

Neural Information Processing Systems

In this paper, we propose a Bayesian framework, variational Bayesian estimation and clustering for speech recognition (VBEC), which constructs shared-state triphone HMMs based on a variational Bayesian approach and recognizes speech based on Bayesian predictive classification. An appropriate model structure with high recognition performance can be found within the VBEC framework. Unlike conventional methods, including those based on the BIC or MDL criterion under the maximum likelihood approach, the proposed model selection is valid in principle even when the amount of data is insufficient, because it does not rely on an asymptotic assumption. In acoustic modeling, triphone-based hidden Markov models (triphone HMMs) have been widely employed; the triphone is a context-dependent phoneme unit that takes both the preceding and following phonemes into account.
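The abstract's central claim, that BIC-style model selection rests on an asymptotic approximation to the evidence, can be illustrated outside of speech recognition. The sketch below (a conjugate Gaussian mean model, not the paper's VBEC procedure; all numbers are assumptions) compares the exact log marginal likelihood with its BIC approximation at several sample sizes:

# Exact evidence vs. BIC for data x ~ N(mu, sigma^2) with known sigma and
# prior mu ~ N(0, tau^2), where the marginal likelihood is closed-form.
import numpy as np
from scipy.stats import multivariate_normal

sigma, tau = 1.0, 2.0                 # known noise sd, prior sd on the mean
rng = np.random.default_rng(1)

for n in (5, 50, 500):
    x = rng.normal(0.5, sigma, size=n)
    # Exact log evidence: x ~ N(0, sigma^2 I + tau^2 J) after integrating mu.
    cov = sigma**2 * np.eye(n) + tau**2 * np.ones((n, n))
    log_evidence = multivariate_normal(mean=np.zeros(n), cov=cov).logpdf(x)
    # BIC approximation: max log likelihood minus (k/2) log n, k = 1 parameter.
    mu_hat = x.mean()
    max_ll = np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                    - (x - mu_hat) ** 2 / (2 * sigma**2))
    bic = max_ll - 0.5 * np.log(n)
    print(f"n={n:4d}  exact={log_evidence:9.2f}  BIC={bic:9.2f}")

# The two scores agree increasingly well as n grows, while the small-n
# discrepancy shows why an asymptotic criterion can mislead on sparse data.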


Expectation-maximization for logistic regression

arXiv.org Machine Learning

We present a family of expectation-maximization (EM) algorithms for binary and negative-binomial logistic regression, drawing a sharp connection with the variational-Bayes algorithm of Jaakkola and Jordan (2000). Indeed, our results allow a version of this variational-Bayes approach to be re-interpreted as a true EM algorithm. We study several interesting features of the algorithm, and of this previously unrecognized connection with variational Bayes. We also generalize the approach to sparsity-promoting priors, and to an online method whose convergence properties are easily established. This latter method compares favorably with stochastic gradient descent in situations with marked collinearity.
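A minimal sketch of the kind of EM algorithm the abstract describes, MAP logistic regression via Polya-Gamma augmentation, is shown below; the ridge prior and its scale are assumptions for illustration, and the paper's own algorithm may differ in detail:

# EM for MAP logistic regression using Polya-Gamma latent variables.
import numpy as np

def em_logistic(X, y, tau=10.0, n_iter=50):
    """MAP estimate of beta under y ~ Bernoulli(sigmoid(X @ beta)),
    beta ~ N(0, tau^2 I), by EM over Polya-Gamma latent variables."""
    n, d = X.shape
    kappa = y - 0.5
    beta = np.zeros(d)
    for _ in range(n_iter):
        psi = X @ beta
        # E-step: E[omega_i] = tanh(psi_i / 2) / (2 psi_i), which -> 1/4 as psi -> 0.
        safe_psi = np.where(psi == 0, 1e-8, psi)
        omega = np.tanh(safe_psi / 2) / (2 * safe_psi)
        # M-step: a weighted ridge least-squares solve.
        A = X.T @ (omega[:, None] * X) + np.eye(d) / tau**2
        beta = np.linalg.solve(A, X.T @ kappa)
    return beta

# Toy usage with synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
p = 1 / (1 + np.exp(-(X @ np.array([1.0, -2.0, 0.5]))))
y = (rng.random(200) < p).astype(float)
print(em_logistic(X, y))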