Local SGD with Periodic Averaging: Tighter Analysis and Adaptive Synchronization
Farzin Haddadpour, Mohammad Mahdi Kamani, Mehrdad Mahdavi, Viveck Cadambe
Neural Information Processing Systems
Communication overhead is one of the key challenges that hinders the scalability of distributed optimization algorithms. In this paper, we study local distributed SGD, where data is partitioned among computation nodes, and the computation nodes perform local updates, periodically exchanging the model among the workers to perform averaging. While local SGD has empirically been shown to provide promising results, a theoretical understanding of its performance remains open. In this paper, we strengthen the convergence analysis for local SGD and show that local SGD can be far less expensive and applied far more generally than current theory suggests. Specifically, we show that for loss functions that satisfy the Polyak-Łojasiewicz condition, $O((pT)^{1/3})$ rounds of communication suffice to achieve a linear speedup, that is, an error of $O(1/(pT))$, where $T$ is the total number of model updates at each worker and $p$ is the number of workers.
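
To make the algorithm the abstract describes concrete, here is a minimal sketch of local SGD with periodic averaging on a synthetic least-squares problem. This is an illustration of the general technique, not the paper's experimental setup or its adaptive synchronization scheme; the names num_workers, sync_period, lr, and the problem dimensions are illustrative assumptions.

```python
# Minimal sketch of local SGD with periodic averaging (illustrative;
# not the paper's experimental configuration).
import numpy as np

rng = np.random.default_rng(0)

# Assumed hyperparameters for the sketch: p workers, sync period tau,
# T local steps per worker, fixed learning rate.
num_workers, sync_period, total_steps, lr = 4, 10, 200, 0.05
d, n_per_worker = 5, 100

# Synthetic least-squares data, partitioned across workers.
w_true = rng.normal(size=d)
X = [rng.normal(size=(n_per_worker, d)) for _ in range(num_workers)]
y = [Xi @ w_true + 0.1 * rng.normal(size=n_per_worker) for Xi in X]

# All workers start from the same initial model.
models = [np.zeros(d) for _ in range(num_workers)]

for t in range(1, total_steps + 1):
    # Each worker takes one SGD step on its own data partition.
    for i in range(num_workers):
        j = rng.integers(n_per_worker)                  # sample a local example
        grad = (X[i][j] @ models[i] - y[i][j]) * X[i][j]  # grad of 0.5*(x.w - y)^2
        models[i] -= lr * grad
    # Periodic synchronization: average the models every sync_period steps.
    if t % sync_period == 0:
        avg = np.mean(models, axis=0)
        models = [avg.copy() for _ in range(num_workers)]

final = np.mean(models, axis=0)
print("distance to w_true:", np.linalg.norm(final - w_true))
```

With p workers and T local steps each, this loop performs T / sync_period communication rounds; the paper's result says that, under the Polyak-Łojasiewicz condition, on the order of $(pT)^{1/3}$ such rounds already suffice for a linear speedup.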