Asymptotic Behaviors of Projected Stochastic Approximation: A Jump Diffusion Perspective

Neural Information Processing Systems 

In this paper, we consider linearly constrained stochastic approximation problems, with federated learning (FL) as a special case. We propose a stochastic approximation algorithm named LPSA that uses probabilistic projections to ensure feasibility: at the n-th iteration, a projection is performed with probability p_n. Considering a specific family of projection probabilities p_n and step sizes \eta_n, we analyze our algorithm from an asymptotic and continuous-time perspective. Using a novel jump diffusion approximation, we show that the trajectories formed by properly rescaled last iterates converge weakly to the solutions of specific SDEs. By analyzing these SDEs, we identify the asymptotic behaviors of LPSA for different choices of (p_n, \eta_n).
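The algorithm described above can be sketched as follows. This is a minimal, hypothetical instance, not the paper's implementation: we assume a linear constraint Ax = b, a simple quadratic stochastic objective, and the polynomial schedules \eta_n = n^{-\alpha}, p_n = n^{-\beta}; all names (A, b, z, alpha, beta) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical problem instance (not from the paper):
# minimize E||x - z||^2 subject to the linear constraint A x = b.
d = 5
A = rng.standard_normal((2, d))
b = rng.standard_normal(2)
z = rng.standard_normal(d)

def project(x):
    # Orthogonal projection onto the affine set {x : A x = b}.
    return x - A.T @ np.linalg.solve(A @ A.T, A @ x - b)

def lpsa(n_iters=20000, alpha=1.0, beta=0.5):
    # Polynomial schedules: step size eta_n = n^{-alpha},
    # projection probability p_n = n^{-beta} (illustrative choices).
    x = project(np.zeros(d))
    for n in range(1, n_iters + 1):
        eta = n ** (-alpha)
        grad = 2.0 * (x - z) + rng.standard_normal(d)  # noisy gradient
        x = x - eta * grad
        if rng.random() < n ** (-beta):
            # Probabilistic projection: enforce feasibility only
            # with probability p_n at the n-th iteration.
            x = project(x)
    return x

x_last = lpsa()
print("constraint residual:", np.linalg.norm(A @ x_last - b))
```

Because projections occur only with probability p_n, iterates drift away from the constraint set between projections; the interplay between this drift (scale \eta_n) and the projection rate p_n is what the jump diffusion analysis quantifies.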