Speeding up Heterogeneous Federated Learning with Sequentially Trained Superclients

Riccardo Zaccone, Andrea Rizzardi, Debora Caldarola, Marco Ciccone, Barbara Caputo

arXiv.org Artificial Intelligence 

Abstract--Federated Learning (FL) allows training machine learning models in privacy-constrained scenarios by enabling the cooperation of edge devices without requiring local data sharing. This approach raises several challenges due to the different statistical distributions of the local datasets and the clients' computational heterogeneity. As a solution, we propose FedSeq, a novel framework leveraging the sequential training of subgroups of heterogeneous clients, i.e. superclients, to emulate the centralized paradigm in a privacy-compliant way.

In 2017, McMahan et al. [25] introduced Federated Learning. In FL, the clients are involved in an iterative two-step process over several communication rounds: (i) independent training on edge devices on local datasets, and (ii) aggregation of the updated models into a shared global one on the server side. This approach is usually effective in homogeneous scenarios, but fails to reach comparable performance on non-i.i.d. data. In particular, it has been shown that the non-i.i.d.-ness of the local datasets leads to unstable and slow convergence [23]. In this work, we tackle the problem of i) non-identical class distribution, meaning that for a given instance-label pair (x, y) drawn from the client distribution P_i(x, y), the label distribution P_i(y) varies across clients. Inspired by the differences with the standard centralized training procedure, which bound any FL algorithm, we introduce Federated Learning via Sequential Superclients Training (FedSeq), a novel algorithm that leverages sequential training. We simulate the presence of homogeneous and larger datasets: clients with different distributions are grouped, forming a superclient, based on a dissimilarity metric.
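To make the round structure concrete, the following is a minimal sketch of a FedSeq-style communication round, not the authors' implementation: the grouping policy (label_dissimilarity, build_superclients), the toy quadratic local update (local_step), and the sample-size-weighted aggregation in fedseq_round are all illustrative assumptions standing in for the paper's actual choices.

```python
# Minimal sketch (assumptions only): clients expose a label histogram used by a
# hypothetical dissimilarity-based grouping, local training is a toy gradient
# step, and the server averages superclient models weighted by sample counts.
import numpy as np

rng = np.random.default_rng(0)

def label_dissimilarity(hist_a, hist_b):
    # Clients with different label histograms are considered dissimilar, so
    # grouping them yields a superclient covering a broader class distribution.
    return np.abs(hist_a - hist_b).sum()

def build_superclients(clients, group_size=2):
    # Hypothetical greedy policy: pair each seed client with its most
    # dissimilar remaining peers.
    remaining = list(clients)
    superclients = []
    while remaining:
        seed = remaining.pop(0)
        remaining.sort(key=lambda c: -label_dissimilarity(seed["hist"], c["hist"]))
        group = [seed] + [remaining.pop(0) for _ in range(min(group_size - 1, len(remaining)))]
        superclients.append(group)
    return superclients

def local_step(weights, client, lr=0.1):
    # Toy local update: one gradient step towards the client's local optimum.
    return weights - lr * (weights - client["optimum"])

def fedseq_round(global_weights, superclients, local_epochs=1):
    updates, sizes = [], []
    for group in superclients:
        w = global_weights.copy()
        # Sequential training inside the superclient: each client continues
        # from the model left by the previous one, emulating training on a
        # larger, more homogeneous dataset.
        for client in group:
            for _ in range(local_epochs):
                w = local_step(w, client)
        updates.append(w)
        sizes.append(sum(c["n_samples"] for c in group))
    # Server-side aggregation: sample-size-weighted average (FedAvg-style).
    return np.average(np.stack(updates), axis=0, weights=np.asarray(sizes, dtype=float))

# Usage with synthetic clients: skewed label histograms and toy local optima.
clients = [
    {"hist": rng.dirichlet(np.ones(10) * 0.1),
     "optimum": rng.normal(size=5),
     "n_samples": int(rng.integers(50, 500))}
    for _ in range(8)
]
superclients = build_superclients(clients, group_size=2)
w = np.zeros(5)
for _ in range(10):
    w = fedseq_round(w, superclients)
print("global weights after 10 rounds:", w)
```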
