Message-Passing for Approximate MAP Inference with Latent Variables
–Neural Information Processing Systems
We consider a general inference setting for discrete probabilistic graphical models where we seek maximum a posteriori (MAP) estimates for a subset of the random variables (max nodes), marginalizing over the rest (sum nodes). We present a hybrid message-passing algorithm to accomplish this. The hybrid algorithm passes a mix of sum and max messages depending on the type of source node (sum or max). We derive our algorithm by showing that it falls out as the solution of a particular relaxation of a variational framework. We further show that the Expectation Maximization algorithm can be seen as an approximation to our algorithm. Experimental results on synthetic and real-world datasets, against several baselines, demonstrate the efficacy of our proposed algorithm.
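The abstract describes passing sum-product messages out of sum (marginalized) nodes and max-product messages out of max (MAP) nodes. As a minimal illustration, the sketch below runs such a hybrid scheme on a hypothetical three-variable binary chain s — m1 — m2, where s is a sum node and m1, m2 are max nodes, and compares it with brute-force marginal MAP. All variable names and potentials are made up for illustration; this is not the paper's implementation, and on general graphs the paper's variational derivation, not this toy recursion, is what justifies the algorithm.

```python
import itertools
import numpy as np

# Hypothetical binary chain s -- m1 -- m2.
# s is a sum node (marginalized out); m1 and m2 are max nodes (MAP variables).
rng = np.random.default_rng(0)
phi_s = rng.random(2)            # unary potential on s
phi_m1 = rng.random(2)           # unary potential on m1
phi_m2 = rng.random(2)           # unary potential on m2
psi_s_m1 = rng.random((2, 2))    # pairwise potential on (s, m1)
psi_m1_m2 = rng.random((2, 2))   # pairwise potential on (m1, m2)

def hybrid_map():
    """Hybrid message passing: message type depends on the source node."""
    # Source is the sum node s, so marginalize: sum-product message to m1.
    msg_s_to_m1 = (phi_s[:, None] * psi_s_m1).sum(axis=0)
    # Source is the max node m2, so maximize: max-product message to m1.
    msg_m2_to_m1 = (phi_m2[None, :] * psi_m1_m2).max(axis=1)
    # Belief at m1 combines its unary potential with both incoming messages.
    belief_m1 = phi_m1 * msg_s_to_m1 * msg_m2_to_m1
    m1 = int(belief_m1.argmax())
    # Decode m2 conditioned on the chosen m1.
    m2 = int((phi_m2 * psi_m1_m2[m1]).argmax())
    return m1, m2

def brute_force_map():
    """Exact marginal MAP: max over (m1, m2) of the marginal over s."""
    best, best_val = None, -1.0
    for m1, m2 in itertools.product(range(2), repeat=2):
        val = sum(phi_s[s] * psi_s_m1[s, m1] for s in range(2))
        val *= phi_m1[m1] * psi_m1_m2[m1, m2] * phi_m2[m2]
        if val > best_val:
            best, best_val = (m1, m2), val
    return best

print(hybrid_map(), brute_force_map())
```

On this tiny chain the sum node is a leaf, so marginalizing it out yields a modified unary potential on m1 and the remaining problem is pure max-product on a tree; the hybrid decoding therefore matches the brute-force answer here. In general, marginal MAP is harder than either pure marginalization or pure MAP, which is why the paper resorts to a variational relaxation.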
Mar-15-2024, 02:42:57 GMT
- Country:
- Asia > Middle East
- Jordan (0.04)
- North America > United States
- California > San Francisco County
- San Francisco (0.14)
- Maryland (0.04)
- Utah (0.04)
- Technology:
- Information Technology > Artificial Intelligence
- Machine Learning (1.00)
- Representation & Reasoning
- Optimization (0.68)
- Search (1.00)
- Uncertainty > Bayesian Inference (0.35)