Lower Bounds and Nearly Optimal Algorithms in Distributed Learning with Communication Compression
Xinmeng Huang, Yiming Chen, Wotao Yin
Neural Information Processing Systems
Recent advances in distributed optimization and learning have shown that communication compression is one of the most effective means of reducing communication cost. While many convergence-rate results exist for algorithms with compressed communication, a lower bound is still missing. Analyses of such algorithms have identified two abstract properties that guarantee convergence: the unbiased property and the contractive property.
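To make the two properties concrete, the sketch below illustrates standard examples of each class of compressor: random-k sparsification, which is unbiased (E[C(x)] = x after rescaling by d/k), and top-k sparsification, which is contractive (||C(x) - x||^2 <= (1 - k/d)||x||^2). These particular operators are illustrative assumptions, not taken from the paper itself.

```python
import numpy as np

def rand_k(x, k, rng):
    """Unbiased compressor: keep k uniformly random coordinates and
    rescale by d/k, so that E[C(x)] = x."""
    d = x.size
    idx = rng.choice(d, size=k, replace=False)
    out = np.zeros_like(x)
    out[idx] = x[idx] * (d / k)
    return out

def top_k(x, k):
    """Contractive compressor: keep the k largest-magnitude
    coordinates; satisfies ||C(x) - x||^2 <= (1 - k/d) ||x||^2."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal(100)

# Empirical check of unbiasedness: averaging many rand-k outputs
# should recover x (up to Monte Carlo noise).
avg = np.mean([rand_k(x, 10, rng) for _ in range(20000)], axis=0)
unbias_gap = np.max(np.abs(avg - x))

# Deterministic check of the contractive inequality for top-k.
err = np.linalg.norm(top_k(x, 10) - x) ** 2
bound = (1 - 10 / 100) * np.linalg.norm(x) ** 2
print(unbias_gap, err <= bound)
```

Note that rand-k transmits the same number of coordinates as top-k but pays for unbiasedness with higher variance, whereas top-k is biased yet never increases the error beyond the stated contraction factor.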