FedSkipTwin: Digital-Twin-Guided Client Skipping for Communication-Efficient Federated Learning
Commey, Daniel, Abbad, Kamel, Crosby, Garth V., Khoukhi, Lyes
arXiv.org Artificial Intelligence
Communication overhead remains a primary bottleneck in federated learning (FL), particularly for applications involving mobile and IoT devices with constrained bandwidth. This work introduces FedSkipTwin, a novel client-skipping algorithm driven by lightweight, server-side digital twins. Each twin, implemented as a simple LSTM, observes a client's historical sequence of gradient norms to forecast both the magnitude and the epistemic uncertainty of its next update. The server leverages these predictions, requesting communication only when either value exceeds a predefined threshold; otherwise, it instructs the client to skip the round, thereby saving bandwidth. Experiments are conducted on the UCI-HAR and MNIST datasets with 10 clients under a non-IID data distribution. The results demonstrate that FedSkipTwin reduces total communication by 12-15.5% across 20 rounds while simultaneously improving final model accuracy by up to 0.5 percentage points compared to the standard FedAvg algorithm. These findings establish that prediction-guided skipping is a practical and effective strategy for resource-aware FL in bandwidth-constrained edge environments.
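The skip rule described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: the LSTM twin is replaced by a simple mean/standard-deviation predictor over the client's gradient-norm history, and the threshold values are assumptions, since the abstract does not specify them.

```python
import statistics

def twin_forecast(norm_history):
    """Placeholder for the server-side digital twin.

    Stands in for the paper's LSTM: predicts the next gradient norm as the
    mean of the history and uses the population standard deviation as a
    crude proxy for epistemic uncertainty (illustrative only).
    """
    predicted_norm = statistics.fmean(norm_history)
    predicted_uncertainty = statistics.pstdev(norm_history)
    return predicted_norm, predicted_uncertainty

def should_communicate(predicted_norm, predicted_uncertainty,
                       norm_threshold=1.0, uncertainty_threshold=0.5):
    """Request the client's update only when the forecast magnitude or its
    uncertainty exceeds a threshold; otherwise the client skips the round.
    Threshold values are hypothetical."""
    return (predicted_norm > norm_threshold
            or predicted_uncertainty > uncertainty_threshold)
```

For example, a client whose recent gradient norms are small and stable would be told to skip, while one with a large or highly variable forecast would be asked to upload its update.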
Jul-21-2025