Greedy Low-Rank Gradient Compression for Distributed Learning with Convergence Guarantees

Chen, Chuyan, He, Yutong, Li, Pengrui, Jia, Weichen, Yuan, Kun

arXiv.org Artificial Intelligence 

Abstract--Distributed optimization is pivotal for large-scale signal processing and machine learning, yet communication overhead remains a major bottleneck. Low-rank gradient compression, in which transmitted gradients are approximated by low-rank matrices to reduce communication, offers a promising remedy. Existing methods typically adopt either randomized or greedy compression strategies: randomized approaches project gradients onto randomly chosen subspaces, introducing high variance and degrading empirical performance; greedy methods select the most informative subspaces, achieving strong empirical results but lacking convergence guarantees. To address this gap, we propose GreedyLore--the first Greedy Low-Rank gradient compression algorithm for distributed learning with rigorous convergence guarantees. GreedyLore incorporates error feedback to correct the bias introduced by greedy compression and introduces a semi-lazy subspace update that ensures the compression operator remains contractive throughout all iterations. With these techniques, we prove that GreedyLore achieves a convergence rate of O(σ/√(NT) + 1/T) under standard optimizers such as MSGD and Adam--marking the first linear-speedup convergence rate for low-rank gradient compression. Extensive experiments are conducted to validate our theoretical findings.

Index Terms--Distributed Learning, Low-Rank Compression, Communication-Efficient Optimization, Error Feedback.

I. INTRODUCTION

Distributed optimization is a promising paradigm for addressing large-scale problems in signal processing and machine learning. In distributed algorithms, each computing node processes its local data while collaborating with others to minimize a global loss function. This approach mitigates the computational burden on individual nodes, reduces memory requirements, and enables efficient parallel computation.
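The two ingredients named in the abstract — greedy low-rank compression (projecting the gradient onto its most informative subspace) and error feedback (carrying the compression residual into the next step so it does not accumulate as bias) — can be sketched as follows. This is a minimal illustration of the general technique, not the authors' GreedyLore algorithm; the function names and the use of a full SVD as the greedy subspace selector are our assumptions.

```python
import numpy as np

def greedy_lowrank_compress(G, r):
    """Greedy rank-r compression: keep the top-r singular subspace of G
    (illustrative choice; by Eckart-Young this is the best rank-r
    approximation, so the operator is contractive for rank(G) > r)."""
    U, S, Vt = np.linalg.svd(G, full_matrices=False)
    return U[:, :r] @ np.diag(S[:r]) @ Vt[:r, :]

def step_with_error_feedback(G, e, r):
    """Compress the error-corrected gradient G + e; return the low-rank
    message to transmit and the residual carried to the next iteration."""
    corrected = G + e
    msg = greedy_lowrank_compress(corrected, r)
    return msg, corrected - msg

# Toy usage: one node's gradient matrix, compressed to rank 2.
rng = np.random.default_rng(0)
G = rng.standard_normal((8, 8))
e = np.zeros_like(G)
msg, e = step_with_error_feedback(G, e, r=2)
```

Here the residual `e` is re-injected before the next compression, which is the standard error-feedback mechanism for correcting the bias of a contractive compressor; the transmitted `msg` needs only the rank-2 factors (8x2 and 2x8 matrices) instead of the full 8x8 gradient.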