HOGWILD!-Gibbs can be PanAccurate

Neural Information Processing Systems

Asynchronous Gibbs sampling has recently been shown to be a fast-mixing and accurate method for estimating probabilities of events on a small number of variables of a graphical model satisfying Dobrushin's condition~\cite{DeSaOR16}. We investigate whether it can be used to accurately estimate expectations of functions of {\em all the variables} of the model. Under the same condition, we show that the synchronous (sequential) and asynchronous Gibbs samplers can be coupled so that the expected Hamming distance between their (multivariate) samples remains bounded by $O(\tau \log n)$, where $n$ is the number of variables in the graphical model and $\tau$ is a measure of the asynchronicity. A similar bound holds for any constant power of the Hamming distance. Hence, the expectation of any function that is Lipschitz with respect to a power of the Hamming distance can be estimated with a bias that grows logarithmically in $n$. Going beyond Lipschitz functions, we consider the bias arising from asynchronicity in estimating the expectation of polynomial functions of all variables in the model.
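The coupling argument in the abstract can be illustrated with a small simulation. The sketch below is not the paper's construction; it is a minimal stand-in that runs a sequential Gibbs chain on an Ising-type model alongside a chain whose reads are stale by up to `tau` updates (a crude model of HOGWILD!-style asynchronous reads), couples the two by sharing the uniform draw at every update, and reports the Hamming distance between them. All names (`cond_prob_plus`, `coupled_gibbs`, the chain-graph `theta`) are illustrative choices, not from the paper.

```python
import math
import random

def cond_prob_plus(view, i, theta):
    """P(x_i = +1 | neighbours) for an Ising-type model with pairwise
    couplings theta, given the (possibly stale) values in view."""
    field = sum(theta[i][j] * view[j] for j in range(len(view)) if j != i)
    return 1.0 / (1.0 + math.exp(-2.0 * field))

def coupled_gibbs(n, theta, tau, steps, seed=0):
    """Run a sequential Gibbs chain x and a stale-read chain y from the same
    initial state, coupled by sharing the uniform draw at every update.
    Returns the Hamming distance between the two chains after `steps` updates."""
    rng = random.Random(seed)         # shared randomness -> the coupling
    delays = random.Random(seed + 1)  # randomness for read staleness
    x = [1] * n
    y = [1] * n
    hist = [list(y)]                  # past states of y, for stale reads
    for t in range(steps):
        i = t % n                     # same systematic scan in both chains
        u = rng.random()              # the shared uniform draw
        x[i] = 1 if u < cond_prob_plus(x, i, theta) else -1
        # The stale chain reads each coordinate from a state up to tau updates old.
        view = [hist[-1 - delays.randint(0, min(tau, len(hist) - 1))][j]
                for j in range(n)]
        y[i] = 1 if u < cond_prob_plus(view, i, theta) else -1
        hist.append(list(y))
    return sum(a != b for a, b in zip(x, y))

# A chain-structured Ising model with weak couplings (Dobrushin regime).
n = 16
theta = [[0.15 if abs(i - j) == 1 else 0.0 for j in range(n)] for i in range(n)]
print(coupled_gibbs(n, theta, tau=0, steps=400))  # zero delay: chains coincide -> 0
print(coupled_gibbs(n, theta, tau=5, steps=400))  # stale reads: a bounded discrepancy
```

With `tau=0` the stale chain sees the current state and the shared draws make the two chains identical; with `tau > 0` the Hamming distance stays small rather than accumulating, which is the qualitative behaviour the $O(\tau \log n)$ bound captures.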


Reviews: HOGWILD!-Gibbs can be PanAccurate

Neural Information Processing Systems

The authors prove theorems about the accuracy of asynchronous Gibbs sampling in graphical models with discrete variables satisfying Dobrushin's condition. I am not familiar with this literature, so I am taking the authors' description of its state as given. Let n be the number of variables in the graphical model, let t be the time index, and let tau be the maximum expected read delay in the asynchronous sampler. The authors' results are as follows:

- Lemma 2: the asynchronous Gibbs sampler can be coupled to a synchronous Gibbs sampler with the same initial state such that the expected Hamming distance between them is bounded by O(tau*log(n)), uniformly in t.
- Lemma 3: an analogous bound holds for the dth moment of the Hamming distance.
- Consequently, if a function f is K-Lipschitz with respect to the dth power of the Hamming distance, the bias of the asynchronous Gibbs sampler in estimating the expectation of f is bounded by O(log^d(n)) (up to additive and multiplicative constants, for sufficiently large t).
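The step from the moment bound to the bias bound is just the Lipschitz property applied under the coupling; schematically (my reading, with $X_t$ and $Y_t$ the coupled synchronous and asynchronous chains and $d_H$ the Hamming distance, treating $\tau$ as a constant):

```latex
\bigl|\mathbb{E} f(X_t) - \mathbb{E} f(Y_t)\bigr|
  \;\le\; K \,\mathbb{E}\!\left[ d_H(X_t, Y_t)^d \right]
  \;\le\; K \cdot O\!\left( (\tau \log n)^d \right).
```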


HOGWILD!-Gibbs can be PanAccurate

Daskalakis, Constantinos, Dikkala, Nishanth, Jayanti, Siddhartha

Neural Information Processing Systems
