Neural Information Processing Systems
Distributed learning is essential for training machine learning algorithms across heterogeneous agents while maintaining data privacy. We conduct an asymptotic analysis of Unified Distributed SGD (UD-SGD), exploring a variety of communication patterns, including decentralized SGD and local SGD within Federated Learning (FL), as well as increasing communication intervals in the FL setting. In this study, we assess how different sampling strategies, such as i.i.d.
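To make the communication patterns concrete, the following is a minimal sketch of a unified distributed SGD loop, not the paper's exact algorithm: each agent takes a local stochastic gradient step and then parameters are mixed by a time-varying doubly stochastic matrix. Using the identity matrix gives purely local steps, a gossip matrix gives decentralized SGD, and periodic full averaging with a growing period mimics an increasing communication interval in FL. The toy quadratic losses, the logarithmic interval schedule, and all function names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, d = 8, 5                      # number of agents, parameter dimension
A = [np.diag(rng.uniform(0.5, 2.0, d)) for _ in range(N)]   # toy quadratic losses (assumption)
b = [rng.normal(size=d) for _ in range(N)]
x = rng.normal(size=(N, d))      # one parameter vector per agent

def stochastic_grad(i, xi):
    # Gradient of 0.5 * xi^T A_i xi - b_i^T xi, plus sampling noise.
    return A[i] @ xi - b[i] + 0.1 * rng.normal(size=d)

def mixing_matrix(k, tau0=2):
    # Increasing communication interval (illustrative schedule): average all
    # agents only when k hits the current interval, which grows slowly with k.
    interval = tau0 + int(np.log1p(k))
    if k % interval == 0:
        return np.full((N, N), 1.0 / N)   # full averaging (FL-style round)
    return np.eye(N)                      # purely local step otherwise

for k in range(1, 2001):
    step = 1.0 / k                        # decreasing step size
    x = x - step * np.array([stochastic_grad(i, x[i]) for i in range(N)])
    x = mixing_matrix(k) @ x              # communication / mixing step

print("disagreement across agents:", np.linalg.norm(x - x.mean(axis=0)))
```

Swapping `mixing_matrix` for a fixed gossip matrix over a communication graph recovers decentralized SGD under the same loop, which is the sense in which the update is "unified."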