Local SGD with Periodic Averaging: Tighter Analysis and Adaptive Synchronization
Farzin Haddadpour, Mohammad Mahdi Kamani, Mehrdad Mahdavi, Viveck Cadambe
Neural Information Processing Systems
Communication overhead is one of the key challenges that hinders the scalability of distributed optimization algorithms. In this paper, we study local distributed SGD, where data is partitioned among computation nodes, and the computation nodes perform local updates, periodically exchanging the model among the workers to perform averaging. While local SGD has empirically been shown to provide promising results, a theoretical understanding of its performance remains open. We strengthen the convergence analysis for local SGD and show that it can be far less expensive and applied far more generally than current theory suggests.
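The scheme the abstract describes can be illustrated with a minimal, self-contained simulation (this is an illustrative sketch, not the paper's algorithm or experimental setup): each worker runs several local SGD steps on its own objective, and models are averaged only once per communication round. The quadratic objectives, step counts, and noise level here are all assumptions chosen for clarity.

```python
import random

def local_sgd(targets, rounds=50, local_steps=8, lr=0.1, seed=0):
    """Simulate local SGD with periodic averaging on 1-D quadratics.

    Worker i minimizes f_i(x) = 0.5 * (x - targets[i])**2, so the
    global optimum is the mean of `targets`. Each worker takes
    `local_steps` noisy gradient steps, then all models are averaged
    (one communication per round instead of one per step).
    """
    rng = random.Random(seed)
    x = [0.0] * len(targets)  # one local model copy per worker
    for _ in range(rounds):
        for i, c in enumerate(targets):
            for _ in range(local_steps):
                grad = (x[i] - c) + rng.gauss(0, 0.01)  # stochastic gradient
                x[i] -= lr * grad
        avg = sum(x) / len(x)  # periodic model averaging
        x = [avg] * len(x)
    return x[0]

# The averaged model approaches the mean of the workers' optima (3.0 here).
print(local_sgd([1.0, 3.0, 5.0]))
```

Increasing `local_steps` reduces the number of communication rounds but lets the local models drift apart between averaging points; the trade-off between this drift and the communication savings is exactly what a tighter convergence analysis quantifies.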