On the Stability Analysis of Open Federated Learning Systems
Youbang Sun, Heshan Fernando, Tianyi Chen, Shahin Shahrampour
We consider open federated learning (FL) systems, where clients may join and/or leave the system during the FL process. Given the variability in the number of participating clients, convergence to a fixed model cannot be guaranteed in open systems. Instead, we resort to a new performance metric that we term the stability of open FL systems, which quantifies the magnitude of the learned model in open systems. Under the assumption that local clients' objective functions are strongly convex and smooth, we theoretically quantify the radius of stability for two FL algorithms, namely local SGD and local Adam. We observe that this radius depends on several key parameters, including the condition number of the objective function and the variance of the stochastic gradients. Our theoretical results are further verified by numerical simulations on synthetic data.

Federated learning (FL) [1] is a machine learning setup in which a group of clients cooperatively learns a statistical model. The learning process is coordinated by a central server, which facilitates the exchange of model updates. FL algorithms enjoy the benefits of model sharing among clients while preserving data privacy, and they also reduce the communication load without significantly sacrificing performance [2]. In a canonical FL algorithm, the central server broadcasts the initial model to all clients; each client then performs several steps of local updates before sending its model back to the server, which aggregates the received models.
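To make the open-system dynamics concrete, here is a minimal Python sketch of local SGD with a client population that changes between rounds. The specifics are illustrative assumptions, not the paper's setup: quadratic strongly convex local losses, a 10% per-round departure probability, Poisson arrivals, and fixed step sizes. The printed norm of the server model is a stand-in for the drift that the stability radius bounds.

```python
# Illustrative sketch only: quadratic local losses, Bernoulli departures,
# and Poisson arrivals are assumptions, not the paper's exact model.
import numpy as np

rng = np.random.default_rng(0)
DIM, LOCAL_STEPS, STEP_SIZE, NOISE_STD = 5, 10, 0.05, 0.1

def make_client():
    """Strongly convex, smooth local objective f_i(x) = 0.5 * ||x - b_i||^2."""
    return {"b": rng.normal(size=DIM)}

def local_sgd(client, x):
    """Run LOCAL_STEPS noisy gradient steps on the client's local loss."""
    x = x.copy()
    for _ in range(LOCAL_STEPS):
        grad = x - client["b"] + NOISE_STD * rng.normal(size=DIM)  # stochastic gradient
        x -= STEP_SIZE * grad
    return x

clients = [make_client() for _ in range(10)]
x_server = np.zeros(DIM)
for rnd in range(50):
    # Open system: the client set changes before each round.
    clients = [c for c in clients if rng.random() > 0.1]   # ~10% depart
    clients += [make_client() for _ in range(rng.poisson(1))]  # random arrivals
    if not clients:
        continue
    # Server broadcasts the model; clients update locally; server averages.
    x_server = np.mean([local_sgd(c, x_server) for c in clients], axis=0)
    print(f"round {rnd:2d}: {len(clients):2d} clients, ||x|| = {np.linalg.norm(x_server):.3f}")
```

Because arriving clients bring fresh local minimizers b_i, the averaged iterate never settles at a fixed point; instead its norm fluctuates within a bounded region, which is the behavior the stability metric is designed to capture.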
arXiv.org Artificial Intelligence
Mar-12-2023