Epidemic Learning: Boosting Decentralized Learning with Randomized Communication
Neural Information Processing Systems
We present Epidemic Learning (EL), a simple yet powerful decentralized learning (DL) algorithm that leverages changing communication topologies to achieve faster model convergence compared to conventional DL approaches. At each round of EL, each node sends its model updates to a random sample of s other nodes (in a system of n nodes). We provide an extensive theoretical analysis of EL, demonstrating that its changing topology culminates in superior convergence properties compared to the state-of-the-art (static and dynamic) topologies. Considering smooth nonconvex loss functions, the number of transient iterations for EL, i.e., the rounds required to achieve asymptotic linear speedup, is in O(
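The per-round communication pattern described above can be sketched in a few lines. This is a hypothetical illustration, not the authors' implementation: models are scalars for simplicity, the function `el_round` and its averaging rule are assumptions, and gradients are supplied externally.

```python
import random

def el_round(models, grads, s, lr):
    """One sketched round of Epidemic Learning: each node takes a local
    gradient step, then sends its model to s peers sampled uniformly at
    random; each node averages its own model with those it received.
    (Hypothetical sketch; the paper's exact update rule may differ.)"""
    n = len(models)
    # Local SGD step at each node.
    updated = [m - lr * g for m, g in zip(models, grads)]
    inbox = [[] for _ in range(n)]
    for i in range(n):
        # Node i samples s distinct peers (excluding itself) and sends
        # its updated model to each of them.
        for j in random.sample([k for k in range(n) if k != i], s):
            inbox[j].append(updated[i])
    # Each node averages its own model with everything it received;
    # in-degrees vary from round to round, which is what makes the
    # topology "changing".
    return [sum([updated[i]] + inbox[i]) / (1 + len(inbox[i]))
            for i in range(n)]
```

Because the sample is redrawn every round, the effective communication graph is a fresh random topology each iteration, in contrast to static-topology DL schemes.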
Feb-10-2025, 19:36:19 GMT