FLuID: Mitigating Stragglers in Federated Learning using Invariant Dropout
Neural Information Processing Systems
Federated Learning (FL) allows machine learning models to train locally on individual mobile devices, synchronizing model updates via a shared server. This approach safeguards user privacy; however, it also creates a heterogeneous training environment due to the varying performance capabilities across devices. As a result, "straggler" devices with lower performance often dictate the overall training time in FL. In this work, we aim to alleviate this performance bottleneck due to stragglers by dynamically balancing the training load across the system. We introduce Invariant Dropout, a method that extracts a sub-model based on a weight-update threshold, thereby minimizing the potential impact on accuracy.
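The idea of extracting a sub-model by thresholding weight updates can be illustrated with a minimal sketch. The function names, the per-neuron update metric, and the quantile-based threshold below are illustrative assumptions, not the paper's exact procedure: neurons whose weights changed little between rounds are treated as "invariant" and dropped from the sub-model sent to a straggler.

```python
import numpy as np

def invariant_dropout_mask(prev_weights, new_weights, drop_fraction=0.25):
    """Hypothetical sketch: flag neurons whose weight updates fall below
    a magnitude threshold as 'invariant' so a straggler can skip them.

    prev_weights, new_weights: 2D arrays (neurons x inputs) from two
    consecutive training rounds.
    Returns a boolean mask: True = keep neuron, False = drop (invariant).
    """
    # Per-neuron update magnitude between consecutive rounds.
    update = np.abs(new_weights - prev_weights).sum(axis=1)
    # Threshold chosen so roughly `drop_fraction` of neurons fall below it.
    threshold = np.quantile(update, drop_fraction)
    return update > threshold

# Toy example: a layer with 8 neurons, 4 inputs each.
rng = np.random.default_rng(0)
prev = rng.normal(size=(8, 4))
new = prev + rng.normal(scale=0.1, size=(8, 4))

mask = invariant_dropout_mask(prev, new, drop_fraction=0.25)
sub_model = new[mask]  # the straggler trains only the retained neurons
print(mask.sum(), "of", len(mask), "neurons kept")
```

Because the dropped neurons are those changing least, the reduced sub-model is expected to lose little accuracy relative to dropping neurons at random.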