SMC Is All You Need: Parallel Strong Scaling
Liang, Xinzhu, Lohani, Sanjaya, Lukens, Joseph M., Kirby, Brian T., Searles, Thomas A., Law, Kody J. H.
arXiv.org Artificial Intelligence
In the general framework of Bayesian inference, the target distribution can only be evaluated up to a constant of proportionality. Classical consistent Bayesian methods such as sequential Monte Carlo (SMC) and Markov chain Monte Carlo (MCMC) have unbounded time-complexity requirements. We develop a fully parallel sequential Monte Carlo (pSMC) method which provably delivers parallel strong scaling, i.e. the time complexity (and per-node memory) remains bounded as the number of asynchronous processes grows. More precisely, pSMC achieves a theoretical convergence rate of MSE$=O(1/NR)$, where $N$ denotes the number of communicating samples in each processor and $R$ denotes the number of processors. In particular, for suitably large problem-dependent $N$, as $R \rightarrow \infty$ the method converges to infinitesimal accuracy MSE$=O(\varepsilon^2)$ with a fixed finite time complexity Cost$=O(1)$ and with no efficiency leakage, i.e. computational complexity Cost$=O(\varepsilon^{-2})$. A range of Bayesian inference problems is considered to compare the pSMC and MCMC methods.
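The idea behind the MSE$=O(1/NR)$ rate can be illustrated with a minimal sketch: run $R$ fully independent SMC samplers of $N$ particles each (no inter-processor communication), and combine their posterior-mean estimates weighted by each run's normalizing-constant estimate. The toy target, tempering schedule, and all function names below are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

def log_prior(x):
    return -0.5 * (x / 2.0) ** 2      # N(0, 2^2), up to a constant

def log_target(x):
    return -0.5 * (x - 1.0) ** 2      # N(1, 1), up to a constant

def smc_run(n, rng, n_temps=10, n_mcmc=5):
    """One independent tempered-SMC run with N = n particles.
    Returns (log normalizing-constant estimate, posterior-mean estimate)."""
    betas = np.linspace(0.0, 1.0, n_temps + 1)
    x = rng.normal(0.0, 2.0, size=n)  # sample the prior
    log_z = 0.0
    for b0, b1 in zip(betas[:-1], betas[1:]):
        # incremental importance weights for this temperature step
        logw = (b1 - b0) * (log_target(x) - log_prior(x))
        m = logw.max()
        log_z += m + np.log(np.exp(logw - m).mean())
        w = np.exp(logw - m)
        w /= w.sum()
        # multinomial resampling
        x = x[rng.choice(n, size=n, p=w)]
        # random-walk Metropolis moves targeting the tempered density
        # gamma_b = prior^(1-b) * target^b
        for _ in range(n_mcmc):
            prop = x + 0.5 * rng.normal(size=n)
            log_acc = ((1.0 - b1) * (log_prior(prop) - log_prior(x))
                       + b1 * (log_target(prop) - log_target(x)))
            accept = np.log(rng.uniform(size=n)) < log_acc
            x = np.where(accept, prop, x)
    return log_z, x.mean()

def psmc(n, r, seed=0):
    """R independent SMC runs combined without communication:
    each run's mean is weighted by its normalizing-constant estimate."""
    results = [smc_run(n, np.random.default_rng(seed + i)) for i in range(r)]
    log_z = np.array([z for z, _ in results])
    means = np.array([m for _, m in results])
    w = np.exp(log_z - log_z.max())
    w /= w.sum()
    return float(np.dot(w, means))
```

Because the $R$ runs never communicate, they can execute fully asynchronously; weighting by the normalizing-constant estimates (rather than plain averaging) is what keeps the combined estimator consistent as $R \rightarrow \infty$ at fixed $N$.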
Feb-8-2024