FedNew: A Communication-Efficient and Privacy-Preserving Newton-Type Method for Federated Learning
Anis Elgabli, Chaouki Ben Issaid, Amrit S. Bedi, Ketan Rajawat, Mehdi Bennis, Vaneet Aggarwal
While first-order methods for FL problems are extensively studied in the literature, only a handful of second-order methods have recently been proposed for FL problems. The main advantage of second-order methods is their faster convergence rate, although they suffer from high communication costs. Moreover, sharing first- and second-order information (which encodes the clients' data) may create privacy issues. This is because the Hessian matrix contains valuable information about the properties of the local function/data, and when it is shared with the parameter server (PS), privacy may be violated by eavesdroppers or an honest-but-curious PS. For instance, the authors of [Yin et al., 2014] show that the eigenvalues of the Hessian matrix can be used to extract important information about the input images. In this work, we are interested in developing communication-efficient second-order methods that still preserve the privacy of the individual clients. To this end, we propose a novel framework that hides the gradients as well as the Hessians of the local functions, yet uses second-order information to solve the FL problem. In particular, we divide the standard Newton step into outer and inner levels. The objective of the inner level is to learn what we call the Hessian inverse-gradient product (or the Zeroth Hessian inverse-gradient product for the computation-efficient version).
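The two-level split can be illustrated with a minimal sketch in Python/NumPy, under assumptions that are ours, not the paper's: a single machine rather than a federated client/PS setup, and a plain gradient step on a quadratic surrogate as the inner solver. All names (inner_step, newton_outer, lr, eta) are hypothetical. The inner level refines a direction d ≈ H⁻¹g, i.e., the Hessian inverse-gradient product, by descending on φ(d) = ½dᵀHd − gᵀd, whose minimizer solves Hd = g; the outer level then applies the Newton-type update x ← x − ηd.

```python
import numpy as np

def inner_step(H, g, d, lr=0.1, steps=1):
    # Inner level (illustrative, not the paper's inner solver):
    # refine d ~= H^{-1} g with a few gradient steps on
    # phi(d) = 0.5*d^T H d - g^T d, using grad phi(d) = H d - g.
    for _ in range(steps):
        d = d - lr * (H @ d - g)
    return d

def newton_outer(grad_fn, hess_fn, x0, eta=1.0, iters=200):
    # Outer level: Newton-type update that reuses the learned
    # direction d (warm-started across iterations) instead of
    # explicitly inverting the Hessian.
    x, d = x0.copy(), np.zeros_like(x0)
    for _ in range(iters):
        g = grad_fn(x)
        H = hess_fn(x)
        d = inner_step(H, g, d)   # inexact Hessian inverse-gradient product
        x = x - eta * d           # Newton-style model update
    return x

# Toy check: minimize 0.5*x^T A x - b^T x, whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
x_hat = newton_outer(lambda x: A @ x - b, lambda x: A, np.zeros(2))
print(x_hat, np.linalg.solve(A, b))  # the two should closely agree
```

In the federated setting the point of the split is that clients can help refine d without ever transmitting their local Hessians or gradients to the PS; the sketch above omits that machinery and only shows the outer/inner alternation itself.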
Jun-17-2022