Correctness of Belief Propagation in Gaussian Graphical Models of Arbitrary Topology

Neural Information Processing Systems

Local "belief propagation" rules of the sort proposed by Pearl [15] are guaranteed to converge to the correct posterior probabilities in singly connected graphical models. Recently, a number of researchers have empirically demonstratedgood performance of "loopy belief propagation" using these same rules on graphs with loops. Perhaps the most dramatic instance is the near Shannon-limit performance of "Turbo codes", whose decoding algorithm is equivalent to loopy belief propagation. Except for the case of graphs with a single loop, there has been little theoretical understandingof the performance of loopy propagation. Here we analyze belief propagation in networks with arbitrary topologies when the nodes in the graph describe jointly Gaussian random variables.


Gaussian Fields for Approximate Inference in Layered Sigmoid Belief Networks

Neural Information Processing Systems

Local "belief propagation" rules of the sort proposed by Pearl [15] are guaranteed to converge to the correct posterior probabilities in singly connected graphical models. Recently, a number of researchers have empirically demonstratedgood performance of "loopy belief propagation" using these same rules on graphs with loops. Perhaps the most dramatic instance is the near Shannon-limit performance of "Turbo codes", whose decoding algorithm is equivalent to loopy belief propagation. Except for the case of graphs with a single loop, there has been little theoretical understandingof the performance of loopy propagation. Here we analyze belief propagation in networks with arbitrary topologies when the nodes in the graph describe jointly Gaussian random variables.


Probabilistic Receiver Architecture Combining BP, MF, and EP for Multi-Signal Detection

arXiv.org Machine Learning

Receiver algorithms which combine belief propagation (BP) with the mean field (MF) approximation are well-suited for inference of both continuous and discrete random variables. In wireless scenarios involving detection of multiple signals, the standard construction of the combined BP-MF framework includes the equalization or multi-user detection functions within the MF subgraph. In this paper, we show that the MF approximation is not particularly effective for multi-signal detection. We develop a new factor graph construction for application of the BP-MF framework to problems involving the detection of multiple signals. We then develop a low-complexity variant of the proposed construction in which Gaussian BP is applied to the equalization factors. In this case, the factor graph of the joint probability distribution is divided into three subgraphs: (i) an MF subgraph comprising the observation factors and channel estimation, (ii) a Gaussian BP subgraph which is applied to multi-signal detection, and (iii) a discrete BP subgraph which is applied to demodulation and decoding. Expectation propagation is used to approximate the discrete distributions with Gaussian distributions and links the discrete BP and Gaussian BP subgraphs. The result is a probabilistic receiver architecture with strong theoretical justification which can be applied to multi-signal detection.
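
The glue between the subgraphs described above is the expectation propagation step: a discrete symbol belief from the decoding subgraph is projected onto a Gaussian by matching its first two moments (the KL-minimizing projection within the Gaussian family) before entering the Gaussian BP equalizer. Below is a minimal sketch of that single projection step; the BPSK-like constellation is an illustrative assumption, not the paper's signal model.

    import numpy as np

    def project_to_gaussian(symbols, probs):
        """Moment-matching (EP) projection of a discrete belief onto a Gaussian."""
        mean = np.sum(probs * symbols)
        var = np.sum(probs * np.abs(symbols - mean) ** 2)
        return mean, var

    # Example: a BPSK belief p(-1) = 0.2, p(+1) = 0.8 projects to N(0.6, 0.64).
    mean, var = project_to_gaussian(np.array([-1.0, 1.0]), np.array([0.2, 0.8]))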


Nonparanormal Belief Propagation (NPNBP)

Neural Information Processing Systems

The empirical success of the belief propagation approximate inference algorithm has inspired numerous theoretical and algorithmic advances. Yet, for continuous non-Gaussian domains performing belief propagation remains a challenging task: recent innovations such as nonparametric or kernel belief propagation, while useful, come with a substantial computational cost and offer few theoretical guarantees, even for tree-structured models. In this work we present Nonparanormal BP for performing efficient inference on distributions parameterized by a Gaussian copula network and any univariate marginals. For tree-structured networks, our approach is guaranteed to be exact for this powerful class of non-Gaussian models. Importantly, the method is as efficient as standard Gaussian BP, and its convergence properties do not depend on the complexity of the univariate marginals, even when a nonparametric representation is used.
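
The construction rests on the nonparanormal (Gaussian copula) representation: each variable X_i is a monotone transform of a latent Gaussian Z_i, so evidence is mapped into the latent space, inference runs as ordinary Gaussian BP there, and results are mapped back through the univariate marginals. A minimal sketch of the two transforms follows, assuming an exponential marginal purely for illustration; a Gaussian BP routine like the sketch above would supply the latent inference.

    import numpy as np
    from scipy import stats

    F = stats.expon(scale=2.0)  # univariate marginal F_i (an illustrative choice)

    # Evidence X_i = x maps into the latent copula space via z = Phi^{-1}(F_i(x)).
    x_obs = 1.3
    z_obs = stats.norm.ppf(F.cdf(x_obs))

    # ... clamp z_obs and run standard Gaussian BP on the latent Gaussian z ...

    # A latent value maps back to the original domain via x = F_i^{-1}(Phi(z)).
    x_back = F.ppf(stats.norm.cdf(z_obs))  # recovers x_obs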


Neural Enhanced Belief Propagation on Factor Graphs

arXiv.org Machine Learning

A graphical model is a structured representation of locally dependent random variables. A traditional method to reason over these random variables is to perform inference using belief propagation. When provided with the true data generating process, belief propagation can infer the optimal posterior probability estimates in tree-structured factor graphs. However, in many cases we may only have access to a poor approximation of the data generating process, or we may face loops in the factor graph, leading to suboptimal estimates. In this work we first extend graph neural networks to factor graphs (FG-GNN). We then propose a new hybrid model that runs an FG-GNN conjointly with belief propagation. The FG-GNN receives as input the messages from belief propagation at every inference iteration and outputs a corrected version of them. As a result, we obtain a more accurate algorithm that combines the benefits of both belief propagation and graph neural networks. We apply our ideas to error correction decoding tasks, and we show that our algorithm can outperform belief propagation for LDPC codes on bursty channels.
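
The hybrid iteration is simple to state: compute the vanilla BP message update, then let the learned network emit a residual correction to those messages before the next round. The sketch below captures that control flow only; the one-layer tanh "network" with random weights is a placeholder assumption, not the paper's FG-GNN architecture, and a real implementation would train the correction end to end on decoding data.

    import numpy as np

    rng = np.random.default_rng(0)
    W = rng.normal(scale=0.1, size=(4, 4))  # placeholder (untrained) weights

    def gnn_correction(messages):
        """Stand-in for the FG-GNN: maps the current messages to residuals."""
        return np.tanh(messages @ W)

    def hybrid_bp_step(messages, bp_update):
        """One neural-enhanced iteration: BP update plus a learned residual."""
        m_bp = bp_update(messages)           # standard BP message update
        return m_bp + gnn_correction(m_bp)   # FG-GNN corrects the messages

    # Toy usage with a damping map standing in for a real BP update rule.
    msgs = rng.normal(size=(8, 4))
    msgs = hybrid_bp_step(msgs, lambda m: 0.5 * m)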