Provable Reduction in Communication Rounds for Non-Smooth Convex Federated Learning

Karlo Palenzuela, Ali Dadras, Alp Yurtsever, Tommy Löfstedt

The authors contributed equally to this work.

Multiple local steps are key to communication-efficient federated learning. However, theoretical guarantees for such algorithms, without assumptions that bound data heterogeneity, have been lacking for general non-smooth convex problems. A typical FL algorithm consists of two main phases: local training and aggregation. Scaffold (Karimireddy et al., 2020) and Scaffnew (Mishchenko et al., 2022) stand out as notable examples of such algorithms. In this work, we explore the following natural question: Can multiple local steps provably reduce communication rounds in the non-smooth convex setting?
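To make the two phases concrete, here is a minimal illustrative sketch, not the paper's algorithm: each client runs several local subgradient steps on a non-smooth convex objective (a hinge loss is assumed here), and the server aggregates by simple averaging. All names (`local_subgradient_steps`, `communication_round`) and the synthetic data are hypothetical and for illustration only.

```python
import numpy as np

def subgrad_hinge(w, X, y):
    """Subgradient of the non-smooth hinge loss mean_i max(0, 1 - y_i <x_i, w>)."""
    margins = y * (X @ w)
    active = margins < 1.0  # points where the hinge term is active
    return -(X[active] * y[active, None]).sum(axis=0) / len(y)

def local_subgradient_steps(w, X, y, num_steps, lr):
    """Local training phase: several subgradient steps from the server model."""
    for _ in range(num_steps):
        w = w - lr * subgrad_hinge(w, X, y)
    return w

def communication_round(w_server, client_data, num_local_steps=10, lr=0.1):
    """One round: broadcast, local training on each client, then aggregation."""
    local_models = [
        local_subgradient_steps(w_server.copy(), X, y, num_local_steps, lr)
        for X, y in client_data
    ]
    return np.mean(local_models, axis=0)  # aggregation by averaging

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, n_clients = 5, 4
    # Hypothetical heterogeneous client datasets for a linear classifier:
    # each client's features are shifted differently to mimic heterogeneity.
    clients = []
    for _ in range(n_clients):
        X = rng.normal(size=(50, d)) + rng.normal(size=d)
        y = np.sign(X @ rng.normal(size=d))
        clients.append((X, y))
    w = np.zeros(d)
    for _ in range(20):
        w = communication_round(w, clients)
    print("model after 20 rounds:", w)
```

In this sketch, increasing `num_local_steps` trades more local computation for fewer communication rounds; whether that trade provably pays off in the non-smooth convex setting is exactly the question the paper studies.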