Stochastic Sampling and Search in Belief Updating Algorithms for Very Large Bayesian Networks

AAAI Conferences

Since belief updating in very large Bayesian networks cannot be effectively addressed by exact methods, approximate inference schemes may often be the only computationally feasible alternative. There are two basic classes of approximate schemes: stochastic sampling and search-based algorithms. We summarize the basic ideas underlying each class, show how they are interrelated, briefly discuss their advantages and disadvantages, and show examples on which each class fails. Finally, we study the properties of a large real network from the point of view of search-based algorithms.
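As a concrete illustration of the stochastic-sampling class of algorithms mentioned above, the following sketch implements likelihood weighting on a hypothetical two-node network A → B. The network, its parameters, and all names are invented for illustration and are not taken from the paper.

```python
import random

# Hypothetical network: A -> B, with P(A=1)=0.3,
# P(B=1|A=1)=0.9, P(B=1|A=0)=0.2.
p_a = 0.3
p_b_given_a = {1: 0.9, 0: 0.2}

def likelihood_weighting(n_samples, evidence_b=1, seed=0):
    """Estimate P(A=1 | B=evidence_b): sample only the unobserved
    node A, and weight each sample by the likelihood of the evidence."""
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n_samples):
        a = 1 if rng.random() < p_a else 0
        # The evidence node is not sampled; its likelihood is the weight.
        w = p_b_given_a[a] if evidence_b == 1 else 1 - p_b_given_a[a]
        num += w * a
        den += w
    return num / den

# Exact posterior for comparison:
# P(A=1|B=1) = 0.3*0.9 / (0.3*0.9 + 0.7*0.2) = 0.27/0.41
estimate = likelihood_weighting(100_000)
```

With enough samples the estimate converges to the exact posterior 0.27/0.41 ≈ 0.659; in very large networks with unlikely evidence, however, most weights become tiny, which is one of the failure modes the paper discusses.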

Conflict and Surprise: Heuristics for Model Revision

Artificial Intelligence

Any probabilistic model of a problem is based on assumptions which, if violated, invalidate the model. Users of probability-based decision aids need to be alerted when cases arise that are not covered by the aid's model. Diagnosis of model failure is also necessary to control dynamic model construction and revision. This paper presents a set of decision-theoretically motivated heuristics for diagnosing situations in which a model is likely to provide an inadequate representation of the process being modeled.
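One well-known diagnostic in this spirit is the conflict measure conf(e) = log(∏ᵢ P(eᵢ) / P(e)), which flags evidence that is jointly less likely than independence would suggest. The sketch below computes it on a hypothetical model with one hidden cause A and two observations B and C; the measure and the numbers are illustrative and are not claimed to be the paper's specific heuristics.

```python
import math

# Hypothetical model: hidden cause A with two observed children B and C.
p_a = 0.3
p_b1 = {1: 0.9, 0: 0.1}   # P(B=1 | A=a)
p_c1 = {1: 0.8, 0: 0.1}   # P(C=1 | A=a)

def p_b(b):
    m = p_a * p_b1[1] + (1 - p_a) * p_b1[0]
    return m if b == 1 else 1 - m

def p_c(c):
    m = p_a * p_c1[1] + (1 - p_a) * p_c1[0]
    return m if c == 1 else 1 - m

def p_joint(b, c):
    """P(B=b, C=c), marginalizing out the hidden cause A."""
    total = 0.0
    for a in (0, 1):
        pa = p_a if a == 1 else 1 - p_a
        pb = p_b1[a] if b == 1 else 1 - p_b1[a]
        pc = p_c1[a] if c == 1 else 1 - p_c1[a]
        total += pa * pb * pc
    return total

def conflict(b, c):
    """log(P(b)P(c) / P(b,c)): positive when the observations are
    jointly less likely than under independence, flagging a case the
    model may not cover; negative when they support each other."""
    return math.log(p_b(b) * p_c(c) / p_joint(b, c))
```

Here coherent evidence (B=1, C=1, both suggesting A=1) yields a negative conflict, while contradictory evidence (B=1, C=0) yields a positive one, signaling a candidate for model revision.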

Using Sensitivity Analysis for Selective Parameter Update in Bayesian Network Learning

AAAI Conferences

The process of building a Bayesian network model is often a bottleneck in applying the Bayesian network approach to real-world problems. One of the daunting tasks is the quantification of the Bayesian network, which often requires specifying a huge number of conditional probabilities. On the other hand, the sensitivity of the network's performance to variations in different probability parameters may differ considerably; thus, certain parameters should be specified with higher precision than others. We present a method for selective update of the probabilities based on the results of sensitivity analysis performed while learning a Bayesian network from data. We first perform sensitivity analysis on a Bayesian network in order to identify the most important (most critical) probability parameters, and then update those probabilities to more accurate values. The process is repeated until refining the probabilities any further does not improve the performance of the network. Our method can also be used in active learning of Bayesian networks, in which case the sensitivity can serve as a criterion guiding active data selection.
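The identify-then-refine loop described above can be sketched with a finite-difference sensitivity analysis on a hypothetical two-node network A → B: perturb each parameter slightly, measure the effect on a query of interest, and refine the most critical parameter first. The network, the query, and the parameter names are assumptions for illustration, not the paper's experimental setup.

```python
# Finite-difference sensitivity of the query P(A=1 | B=1) to each CPT
# parameter of a hypothetical A -> B network.

def posterior_a_given_b(params):
    """Exact P(A=1 | B=1) for the two-node network."""
    p_a = params["p_a"]               # P(A=1)
    q = params["p_b1|a1"]             # P(B=1 | A=1)
    r = params["p_b1|a0"]             # P(B=1 | A=0)
    joint_a1 = p_a * q
    joint_a0 = (1 - p_a) * r
    return joint_a1 / (joint_a1 + joint_a0)

def sensitivities(params, eps=1e-4):
    """Absolute finite-difference derivative of the query w.r.t.
    each parameter; larger values mean more critical parameters."""
    base = posterior_a_given_b(params)
    sens = {}
    for name in params:
        bumped = dict(params, **{name: params[name] + eps})
        sens[name] = abs(posterior_a_given_b(bumped) - base) / eps
    return sens

params = {"p_a": 0.3, "p_b1|a1": 0.9, "p_b1|a0": 0.2}
sens = sensitivities(params)
# Refine the most critical parameter first, then repeat until the
# network's performance stops improving.
most_critical = max(sens, key=sens.get)
```

In this toy instance the query is most sensitive to the false-positive rate P(B=1 | A=0), so that parameter would be the first candidate for more precise elicitation or more data.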

Generalized Evidence Pre-propagated Importance Sampling for Hybrid Bayesian Networks

AAAI Conferences

In this paper, we first provide a new theoretical understanding of the Evidence Pre-propagated Importance Sampling algorithm (EPIS-BN) (Yuan & Druzdzel 2003; 2006b) and show that its importance function minimizes the KL-divergence between the function itself and the exact posterior probability distribution in polytrees. We then generalize the method to inference in general hybrid Bayesian networks consisting of deterministic equations and arbitrary probability distributions. Using a novel technique called soft arc reversal, the new algorithm can also handle evidential reasoning with observed deterministic variables.
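To make the role of the KL-divergence concrete, here is a minimal sketch (not the EPIS-BN algorithm itself) of self-normalized importance sampling: the closer the importance function is to the posterior, the more uniform the weights and the larger the effective sample size. The Gaussian target and proposals are invented for illustration.

```python
import math
import random

def snis(proposal_mu, n=5000, seed=0):
    """Self-normalized importance sampling of E[X] under a
    standard-normal 'posterior', using an N(proposal_mu, 1)
    importance function. Returns (mean estimate, effective
    sample size)."""
    rng = random.Random(seed)
    xs = [rng.gauss(proposal_mu, 1.0) for _ in range(n)]
    # Unnormalized target density over proposal density;
    # shared normalizing constants cancel in self-normalization.
    ws = [math.exp(-x * x / 2) / math.exp(-(x - proposal_mu) ** 2 / 2)
          for x in xs]
    wsum = sum(ws)
    mean = sum(w * x for w, x in zip(ws, xs)) / wsum
    ess = wsum ** 2 / sum(w * w for w in ws)  # effective sample size
    return mean, ess

mean_good, ess_good = snis(0.0)  # zero KL-divergence: weights constant
mean_bad, ess_bad = snis(2.0)    # mismatched proposal: weights degenerate
```

With a perfectly matched importance function the effective sample size equals the number of samples; shifting the proposal away from the posterior collapses it, which is why importance functions that minimize the KL-divergence to the posterior are desirable.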