FLex&Chill: Improving Local Federated Learning Training with Logit Chilling

Kichang Lee, Songkuk Kim, JeongGil Ko

arXiv.org Artificial Intelligence 

Federated learning is inherently hampered by data heterogeneity: non-iid training data distributed over local clients. We propose a novel model training approach for federated learning, FLex&Chill, which exploits the Logit Chilling method. Through extensive evaluations, we demonstrate that, in the presence of non-iid data characteristics inherent in federated learning systems, this approach can expedite model convergence and improve inference accuracy.

For instance, FedProx [Li et al., 2020] controls the number of iterations for each local device, aiming to train models resilient to the challenges posed by non-independent and non-identically distributed (non-iid) data environments. SCAFFOLD [Karimireddy et al., 2020] achieves expedited convergence and improved model accuracy over FedAvg [McMahan et al., 2017] by introducing a correction term to balance the influence of each client. These operations alleviate the problems posed by the non-iid environment, a common …
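To make the correction idea concrete, below is a minimal sketch of a SCAFFOLD-style local update in PyTorch. In SCAFFOLD, the correction is the difference between global and per-client control variates, applied at each local gradient step (the variates themselves are aggregated alongside the model across rounds). The function name, the plain-SGD loop, and the zero-initialized control variates are illustrative assumptions, not code from either paper.

```python
import torch

def scaffold_local_update(model, data_iter, loss_fn, c_global, c_local, lr=0.1):
    """One pass of SCAFFOLD-style local training (illustrative sketch).

    c_global and c_local are control variates estimating the global and
    this client's gradient directions; applying their difference at each
    SGD step reduces client drift under non-iid data.
    """
    for x, y in data_iter:
        loss = loss_fn(model(x), y)
        model.zero_grad()
        loss.backward()
        with torch.no_grad():
            for p, ci, c in zip(model.parameters(), c_local, c_global):
                p -= lr * (p.grad - ci + c)  # corrected gradient step

# Toy usage: with zero-initialized control variates this reduces to plain SGD.
model = torch.nn.Linear(16, 4)
zeros = [torch.zeros_like(p) for p in model.parameters()]
data = [(torch.randn(8, 16), torch.randint(0, 4, (8,)))]
scaffold_local_update(model, data, torch.nn.functional.cross_entropy, zeros, zeros)
```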
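The excerpt does not define the Logit Chilling method that FLex&Chill exploits, but the name points to softmax temperature scaling with a temperature below 1, i.e., dividing the logits by a small T to sharpen the output distribution during local training. The sketch below is a minimal illustration under that assumption; the helper name `chilled_cross_entropy`, the temperature value 0.5, and the toy model are hypothetical.

```python
import torch
import torch.nn.functional as F

def chilled_cross_entropy(logits, target, temperature=0.5):
    """Cross-entropy on temperature-scaled logits (illustrative sketch).

    A temperature below 1 sharpens the softmax distribution, so
    confidently wrong predictions yield a stronger gradient signal --
    one plausible mechanism for faster local convergence on small,
    skewed (non-iid) client datasets.
    """
    return F.cross_entropy(logits / temperature, target)

# Illustrative local-training step on one client.
model = torch.nn.Linear(16, 4)  # toy model
x, y = torch.randn(8, 16), torch.randint(0, 4, (8,))
opt = torch.optim.SGD(model.parameters(), lr=0.1)

loss = chilled_cross_entropy(model(x), y, temperature=0.5)
opt.zero_grad()
loss.backward()
opt.step()
```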