Fed-ensemble: Improving Generalization through Model Ensembling in Federated Learning
Naichen Shi, Fan Lai, Raed Al Kontar, Mosharaf Chowdhury
The rapid increase in computational power on edge devices has established federated learning (FL) as an elegant alternative to traditional cloud/data-center-based analytics. FL brings training to the edge, where devices collaboratively extract knowledge and learn complex models (most often deep learning models) under the orchestration of a central server while keeping their personal data stored locally. This paradigm shift not only reduces privacy concerns but also offers many intrinsic advantages, including cost efficiency, diversity, and reduced communication, among others [39, 18]. The earliest and perhaps most popular FL algorithm is FederatedAveraging (fedavg) [28]. In fedavg, the central server broadcasts a global model (a set of weights) to selected edge devices, these devices run updates based on their local data, and the server then takes a weighted average of the resulting local models to update the global model.
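The fedavg aggregation step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: client models are represented as lists of NumPy weight arrays, and the names `client_weights` and `num_samples` are illustrative. The weighted average uses each client's local data size as its weight, as in fedavg.

```python
import numpy as np

def fedavg_aggregate(client_weights, num_samples):
    """Weighted average of client model weights.

    client_weights: list of models, each a list of per-layer numpy arrays.
    num_samples: number of local data points on each client (the weights).
    """
    total = sum(num_samples)
    num_layers = len(client_weights[0])
    global_weights = []
    for layer in range(num_layers):
        # Sum each client's layer, scaled by its share of the total data.
        layer_avg = sum(
            (n / total) * model[layer]
            for model, n in zip(client_weights, num_samples)
        )
        global_weights.append(layer_avg)
    return global_weights

# Example: two clients with one-layer "models"; the second client holds
# three times as much data, so the average leans toward its weights.
clients = [[np.array([1.0, 2.0])], [np.array([3.0, 4.0])]]
counts = [1, 3]
print(fedavg_aggregate(clients, counts))  # [array([2.5, 3.5])]
```

In a full FL round, the server would broadcast `global_weights` back to a new sample of clients and repeat.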
Jul-21-2021