Gauges, Loops, and Polynomials for Partition Functions of Graphical Models

Michael Chertkov, Yury Maximov

arXiv.org Machine Learning 

We suggest a new methodology for analysis and computations that combines the gauge transformation (GT) technique from (Chertkov, Chernyak 2006) with the approach developed in (Gurvits 2011; Anari, Gharan 2017; Straszak, Vishnoi 2017), which builds on recent progress in the field of real stable polynomials. We show that GTs (while keeping the PF invariant) allow representation of the PF as a sum of polynomials in variables associated with the edges of the graph. A special belief propagation (BP) gauge makes a single term of the series least sensitive to variations, thus resulting in the loop series for the PF introduced in (Chertkov, Chernyak 2006). In addition to restating the known results in polynomial form, we also discover a new relation between the computationally tractable BP term (the single term of the loop series evaluated at the BP gauge) and the PF: sequential application of differential operators, each associated with an edge of the graph, to the BP polynomial results in the PF. Each term in the sequence corresponds to a BP polynomial of a modified GM derived by contraction of an edge. Even though the complexity of computing factors of the derived GMs grows exponentially with the number of eliminated edges, the polynomials associated with the new factors remain bi-stable if the original factors have this property. Moreover, we show that the BP estimate of the PF does not decrease with edge eliminations, overall yielding a new proof of the result, also following from a combination of (Anari, Gharan 2017) and (Straszak, Vishnoi 2017), that the BP solution of the original GM with factors corresponding to bi-stable polynomials gives a lower bound on the PF.
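For concreteness, the central object of the abstract, the partition function (PF) of a pairwise graphical model, can be computed by brute force on a tiny example. The sketch below is illustrative and not from the paper; the graph, couplings, and names are assumptions chosen for clarity.

```python
# Brute-force partition function of a small pairwise (Ising-like) graphical
# model: Z = sum over sigma in {-1,+1}^n of exp(sum_{(i,j)} J[i,j] s_i s_j).
# Illustrative only; the paper works with general factors and BP gauges.
import itertools
import math

def partition_function(n, edges, J):
    """Sum the Boltzmann weights over all 2^n spin configurations."""
    Z = 0.0
    for sigma in itertools.product((-1, 1), repeat=n):
        energy = sum(J[(i, j)] * sigma[i] * sigma[j] for (i, j) in edges)
        Z += math.exp(energy)
    return Z

# 3-node cycle (the smallest graph with a loop) with uniform coupling J = 1
edges = [(0, 1), (1, 2), (0, 2)]
J = {e: 1.0 for e in edges}
Z = partition_function(3, edges, J)
```

For this symmetric 3-cycle the sum collapses to two aligned configurations of weight e^{3J} and six frustrated ones of weight e^{-J}, i.e. Z = 2e^{3J} + 6e^{-J}; exact enumeration like this is what BP-based bounds of the kind discussed above are compared against on loopy graphs.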
