Mitigating Persistent Client Dropout in Asynchronous Decentralized Federated Learning
Ignacy Stępka, Nicholas Gisolfi, Kacper Trębacz, Artur Dubrawski
arXiv.org Artificial Intelligence
We consider the problem of persistent client dropout in asynchronous Decentralized Federated Learning (DFL). Asynchronicity and decentralization obscure information about model updates among federation peers, making recovery from a client dropout difficult: access to the number of learning epochs, the data distributions, and the information needed to precisely reconstruct the missing neighbor's loss function is limited. We show that obvious mitigations do not adequately address the problem, and we introduce adaptive strategies based on client reconstruction. These strategies can effectively recover some of the performance lost to dropout. Our work focuses on asynchronous DFL with local regularization and differs substantially from the existing literature. We evaluate the proposed methods on tabular and image datasets, with three DFL algorithms and three data heterogeneity scenarios (iid, non-iid, class-focused non-iid). Our experiments show that the proposed adaptive strategies can be effective in maintaining the robustness of federated learning even when they do not reconstruct the missing client's data precisely. We also discuss limitations and identify future avenues for tackling the problem of client dropout.
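The abstract contrasts naive mitigations with adaptive, reconstruction-based ones. As an illustration only (the paper's actual reconstruction strategies are not detailed here), the following sketch shows the setting: in decentralized FL each client aggregates models from its neighbors, and when a neighbor drops out persistently, a simple mitigation is to reuse the last model that neighbor sent rather than silently shrinking the neighborhood. The `Client` class and its methods are hypothetical names, not the authors' API, and the scalar "model" stands in for real parameter vectors.

```python
# Hedged sketch of neighbor aggregation under persistent dropout in DFL.
# All names here are illustrative assumptions, not the paper's method.

class Client:
    def __init__(self, cid, model):
        self.cid = cid
        self.model = model       # toy scalar "model"; real DFL uses weight vectors
        self.last_seen = {}      # neighbor id -> last model received from them

    def receive(self, sender_id, model):
        # Cache the most recent model from each neighbor (asynchronous setting:
        # messages arrive at different times, so staleness is inherent).
        self.last_seen[sender_id] = model

    def aggregate(self, active_neighbors, use_stale=True):
        # Average own model with neighbor models. For neighbors that have
        # dropped out, optionally substitute their last cached model
        # (a naive stale-reuse mitigation) instead of excluding them.
        models = [self.model]
        for nid, m in self.last_seen.items():
            if nid in active_neighbors or use_stale:
                models.append(m)
        self.model = sum(models) / len(models)
```

With `use_stale=False` the dropped neighbor simply vanishes from the average, which is one of the "obvious mitigations" a reconstruction-based strategy would improve upon.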
Aug-5-2025