Reviews: On the Global Convergence of (Fast) Incremental Expectation Maximization Methods

Neural Information Processing Systems 

This paper provides global convergence analyses, with convergence rates, of stochastic EM algorithms, including the incremental (iEM) and variance-reduced (sEM-VR) versions of the EM algorithm. In particular, the paper establishes a convergence rate of O(n/\epsilon) for iEM by applying the theory developed by Mairal (2015), and a convergence rate of O(n^{2/3}/\epsilon) for sEM-VR by showing that sEM-VR is a special case of stochastic scaled-gradient methods. In addition, a new variance-reduced EM algorithm named fiEM, based on SAGA, is proposed and analyzed, achieving the same O(n^{2/3}/\epsilon) rate as sEM-VR. Finally, the superiority of the variance-reduced variants (sEM-VR and fiEM) is demonstrated in numerical experiments.

Clarity: The paper is clear and well written.
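To make the SAGA-style variance reduction discussed above concrete, the following is a minimal illustrative sketch (my own toy construction, not the paper's implementation): a stochastic EM on the sufficient statistics of a 1D two-component Gaussian mixture with known unit variances and equal weights, where only the component means are estimated. The per-sample statistic memory table, the step size gamma, and the model are all illustrative assumptions.

```python
import numpy as np

# Toy data: two well-separated Gaussian clusters with unit variance.
rng = np.random.default_rng(0)
n = 1000
x = np.concatenate([rng.normal(-2, 1, n // 2), rng.normal(2, 1, n // 2)])

def e_step(xi, mu):
    # Per-sample sufficient statistic: responsibilities and responsibility-
    # weighted sample, stacked as [r1, r2, r1*xi, r2*xi].
    log_p = -0.5 * (xi - mu) ** 2          # log-density up to constants
    r = np.exp(log_p - log_p.max())
    r /= r.sum()
    return np.concatenate([r, r * xi])

def m_step(s):
    # Map averaged sufficient statistics back to the component means.
    r, rx = s[:2], s[2:]
    return rx / np.maximum(r, 1e-12)

mu = np.array([-1.0, 1.0])                           # initial means
memory = np.stack([e_step(xi, mu) for xi in x])      # SAGA memory table
s_bar = memory.mean(axis=0)                          # table average
s = s_bar.copy()                                     # running statistic
gamma = 0.05                                         # step size (illustrative)

for k in range(20000):
    i = rng.integers(n)
    new_si = e_step(x[i], mu)
    # SAGA-style variance-reduced estimate of the full-data E-step statistic:
    svr = s_bar + (new_si - memory[i])
    s = (1.0 - gamma) * s + gamma * svr
    # Keep the table and its average in sync, then re-run the M-step.
    s_bar = s_bar + (new_si - memory[i]) / n
    memory[i] = new_si
    mu = m_step(s)
```

On this toy problem the estimated means approach the true cluster centers at -2 and 2; the point of the variance-reduced correction `s_bar + (new_si - memory[i])` is that its noise vanishes as the memory table converges, which is the mechanism behind the improved O(n^{2/3}/\epsilon) rate reported for sEM-VR and fiEM.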