A Distributed Cubic-Regularized Newton Method for Smooth Convex Optimization over Networks
Uribe, César A., Jadbabaie, Ali
We propose a distributed, cubic-regularized Newton method for large-scale convex optimization over networks. The proposed method requires only local computations and communications and is suitable for federated learning applications over arbitrary network topologies. We show an $O(k^{-3})$ convergence rate when the cost function is convex with Lipschitz-continuous gradient and Hessian, where $k$ is the number of iterations. We further provide network-dependent bounds on the communication required in each step of the algorithm. Numerical experiments validate our theoretical results.
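The core subproblem in any cubic-regularized Newton scheme is minimizing the cubic-regularized second-order model $g^\top s + \tfrac{1}{2} s^\top H s + \tfrac{M}{6}\|s\|^3$. The sketch below is a minimal centralized illustration of that step, not the paper's distributed algorithm; the function name, the bisection strategy, and the parameter $M$ (the Hessian Lipschitz constant) are illustrative assumptions. It exploits the fact that the minimizer satisfies $s = -(H + \tfrac{M r}{2} I)^{-1} g$ with $r = \|s\|$, which reduces to a one-dimensional root-finding problem in $r$.

```python
import numpy as np

def cubic_newton_step(grad, hess, M, tol=1e-10):
    """Illustrative (centralized) cubic-regularized Newton step.

    Minimizes  g^T s + 0.5 s^T H s + (M/6) ||s||^3  for a positive
    semidefinite H by bisection on r = ||s||: the optimality condition
    gives s(r) = -(H + (M r / 2) I)^{-1} g, and ||s(r)|| - r is
    monotonically decreasing in r, so its root is unique.
    """
    n = grad.shape[0]
    I = np.eye(n)

    def s_of(r):
        # Regularized Newton direction for a given radius guess r.
        return -np.linalg.solve(hess + 0.5 * M * r * I, grad)

    # Bracket the fixed point r = ||s(r)||.
    lo, hi = 0.0, 1.0
    while np.linalg.norm(s_of(hi)) > hi:
        hi *= 2.0
    # Bisect until the bracket is tight.
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if np.linalg.norm(s_of(mid)) > mid:
            lo = mid
        else:
            hi = mid
    return s_of(hi)
```

On a strongly convex quadratic $f(x) = \tfrac{1}{2} x^\top A x - b^\top x$, iterating `x += cubic_newton_step(A @ x - b, A, M)` converges rapidly to $A^{-1} b$, since the cubic term only damps the exact Newton step. The distributed method in the paper additionally restricts each step to local computation and neighbor-to-neighbor communication over the network.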
Jul-7-2020