AAAI Conferences

Belief propagation over Markov random fields has been successfully used in many AI applications since it yields accurate inference results by iteratively updating messages between nodes. However, its high computation cost is a barrier to practical use. This paper presents an efficient approach to belief propagation. Our approach, Quiet, dynamically detects converged messages to skip unnecessary updates in each iteration, while theoretically guaranteeing the same output as the standard approach used to implement belief propagation. Experiments show that our approach is significantly faster than existing approaches without sacrificing inference quality.
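The skip-converged idea can be illustrated with a minimal sketch. This is not the paper's Quiet algorithm itself; the function name, the single shared pairwise table, and the re-activation rule (a node's outgoing messages are re-checked whenever one of its inputs changes, which is what preserves exactness) are illustrative assumptions.

```python
import numpy as np

def bp_skip_converged(unary, pairwise, edges, n_iter=100, tol=1e-9):
    """Sum-product BP that skips messages flagged as converged, and
    re-activates a node's outgoing messages when one of its inputs changes.

    unary:    dict node -> np.array of unary potentials
    pairwise: shared potential table P[x_i, x_j] for every edge (i, j)
    edges:    list of (i, j) pairs
    """
    edge_set = set(edges)
    msgs = {}
    for i, j in edges:  # initialize both directions to uniform
        msgs[(i, j)] = np.ones(len(unary[j])) / len(unary[j])
        msgs[(j, i)] = np.ones(len(unary[i])) / len(unary[i])
    converged = set()
    for _ in range(n_iter):
        any_change = False
        for key in list(msgs):
            if key in converged:
                continue  # skip: this message is already stable
            i, j = key
            # product of unary at i and incoming messages, excluding j's
            prod = unary[i].copy()
            for (a, b) in msgs:
                if b == i and a != j:
                    prod = prod * msgs[(a, b)]
            # orient the pairwise table: edges store P[x_i, x_j] for (i, j)
            new = (pairwise.T @ prod) if key in edge_set else (pairwise @ prod)
            new = new / new.sum()
            if np.max(np.abs(new - msgs[key])) < tol:
                converged.add(key)
            else:
                any_change = True
                msgs[key] = new
                # node j's inputs changed, so its outgoing messages may too
                for other in list(converged):
                    if other[0] == j:
                        converged.discard(other)
        if not any_change:
            break
    beliefs = {}
    for v in unary:
        b = unary[v].copy()
        for (a, c) in msgs:
            if c == v:
                b = b * msgs[(a, c)]
        beliefs[v] = b / b.sum()
    return beliefs
```

On a tree this converges to the exact marginals while only recomputing messages whose inputs actually moved; on a loopy graph it matches what plain loopy BP would return.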

A Differential Semantics for Jointree Algorithms

Neural Information Processing Systems

A new approach to inference in belief networks has been recently proposed, which is based on an algebraic representation of belief networks using multi-linear functions. According to this approach, the key computational question is that of representing multi-linear functions compactly, since inference reduces to a simple process of evaluating and differentiating such functions. We show here that mainstream inference algorithms based on jointrees are a special case of this approach in a very precise sense. We use this result to prove new properties of jointree algorithms, and then discuss some of its practical and theoretical implications.
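The multi-linear representation can be made concrete on a toy network. The sketch below, with hypothetical CPT numbers, builds the network polynomial for a two-node network A → B and recovers a joint probability by partial differentiation with respect to an evidence indicator; since the polynomial is linear in each indicator, the derivative is obtained by evaluating it with that indicator set to 1 and its siblings to 0.

```python
import itertools

# Hypothetical CPTs for a tiny network A -> B over binary variables
theta_a = {0: 0.6, 1: 0.4}                                    # P(A=a)
theta_b_given_a = {(0, 0): 0.9, (1, 0): 0.1,                  # P(B=b | A=a),
                   (0, 1): 0.3, (1, 1): 0.7}                  # keyed (b, a)

def network_poly(lam_a, lam_b):
    """Multilinear network polynomial f = sum_{a,b} lam_a[a] lam_b[b] theta_a[a] theta_{b|a}."""
    return sum(lam_a[a] * lam_b[b] * theta_a[a] * theta_b_given_a[(b, a)]
               for a, b in itertools.product([0, 1], repeat=2))

def d_dlam_a(a_val, lam_b):
    """df/dlam_a: f is linear in each indicator, so the partial derivative
    is f evaluated with lam_a[a_val] = 1 and the other A-indicators = 0."""
    lam_a = {a: 1.0 if a == a_val else 0.0 for a in [0, 1]}
    return network_poly(lam_a, lam_b)
```

With all indicators set to 1, f sums to 1; with evidence B = 0 (so lam_b = {0: 1, 1: 0}), the derivative with respect to the A = 0 indicator yields P(A=0, B=0) = 0.6 * 0.9 = 0.54, which is the kind of quantity jointree propagation computes in one pass.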

MapReduce Lifting for Belief Propagation

AAAI Conferences

Judging by the increasing impact of machine learning on large-scale data analysis in the last decade, one can anticipate substantial growth in the diversity of machine learning applications for "big data" over the next decade. This exciting new opportunity, however, also raises many challenges. One of them is scaling inference in, and training of, graphical models. Typical ways to address this scaling issue are inference by approximate message passing, stochastic gradients, and MapReduce, among others. Often, we encounter inference and training problems with symmetries and redundancies in the graph structure. It has been shown that inference and training can indeed benefit from exploiting symmetries, for example by lifting loopy belief propagation (LBP). That is, a model is compressed by grouping together nodes that send and receive identical messages, so that a modified LBP running on the lifted graph yields the same marginals as LBP on the original one, but often in a fraction of the time. By establishing a link between lifting and radix sort, we show that lifting is MapReduce-able and thus combine two orthogonal approaches to scaling inference, namely exploiting symmetries and employing parallel computations.
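The compression step behind lifting can be sketched as iterative color refinement: nodes start with colors reflecting their local potentials, then repeatedly receive new colors based on the sorted multiset of their neighbors' colors (the sorting of these signatures is where radix sort, and hence MapReduce, comes in). This is a generic sketch under those assumptions, not the paper's exact procedure.

```python
def color_refine(adj, init_colors, n_rounds=10):
    """Group nodes that would send/receive identical messages.

    adj:         dict node -> list of neighbor nodes
    init_colors: dict node -> int, encoding local potentials/evidence
    Returns a stable coloring; nodes sharing a color form one supernode
    of the lifted graph.
    """
    colors = dict(init_colors)
    for _ in range(n_rounds):
        # signature: own color + sorted multiset of neighbor colors
        sig = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
               for v in adj}
        # relabel signatures to compact integers (the sortable keys make
        # this step radix-sortable, and each round is a map+reduce)
        table = {s: i for i, s in enumerate(sorted(set(sig.values())))}
        new = {v: table[sig[v]] for v in adj}
        if new == colors:  # fixed point: lifted structure found
            break
        colors = new
    return colors
```

On a 4-cycle with identical potentials everywhere, all nodes collapse into a single supernode; on a 3-node path, the two endpoints group together while the middle node stays separate, so modified LBP need only compute two distinct messages per round instead of four.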

Efficient and accurate group testing via Belief Propagation: an empirical study

Artificial Intelligence

The group testing problem asks for efficient pooling schemes and algorithms that allow screening of moderately large numbers of samples for rare infections. The goal is to accurately identify the infected samples while conducting the fewest possible tests. Exploring techniques centred around the Belief Propagation message passing algorithm, we suggest a new test design that significantly increases the accuracy of the results. The new design comes with Belief Propagation as an efficient inference algorithm. Aiming for results on practical rather than asymptotic problem sizes, we conduct an experimental study.
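The inference target in group testing can be illustrated on a toy instance. The sketch below computes, by brute-force enumeration over a noiseless model, the posterior infection probability of each sample given pooled test results; this is exactly the quantity Belief Propagation estimates without the exponential cost, so the example shows the problem rather than the paper's algorithm or test design.

```python
import itertools

def exact_marginals(n, pools, results, p=0.1):
    """Posterior P(item i infected | test results) under noiseless group
    testing: a pool tests positive iff it contains an infected item.

    n:       number of samples
    pools:   list of lists of item indices, one per test
    results: list of booleans, one per test
    p:       prior infection probability (illustrative value)
    """
    post = [0.0] * n
    Z = 0.0
    for x in itertools.product([0, 1], repeat=n):  # all infection patterns
        prior = 1.0
        for xi in x:
            prior *= p if xi else 1 - p
        # keep patterns consistent with every observed test outcome
        if all(any(x[i] for i in pool) == r for pool, r in zip(pools, results)):
            Z += prior
            for i in range(n):
                if x[i]:
                    post[i] += prior
    return [q / Z for q in post]
```

For instance, with pools {0,1} (positive) and {1,2} (negative), the negative pool clears items 1 and 2, which forces item 0 to be infected; the posterior is [1, 0, 0] regardless of the prior, and a BP decoder reaches the same conclusion by message passing on the pooling graph.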