Differentially Private Optimization with Sparse Gradients
Neural Information Processing Systems
Motivated by applications of large embedding models, we study differentially private (DP) optimization problems under sparsity of individual gradients. We start with new near-optimal bounds for the classic mean estimation problem but with sparse data, improving upon existing algorithms particularly for the high-dimensional regime. The corresponding lower bounds are based on a novel block-diagonal construction that is combined with existing DP mean estimation lower bounds. Next, we obtain pure- and approximate-DP algorithms with almost optimal rates for stochastic convex optimization with sparse gradients; the former represents the first nearly dimension-independent rates for this problem. Furthermore, by introducing novel analyses of bias reduction in mean estimation and randomly-stopped biased SGD, we obtain nearly dimension-independent rates for near-stationary points of the empirical risk in nonconvex settings under approximate DP.
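The paper's sparsity-aware algorithms are not reproduced here, but the standard baseline they improve upon, Gaussian-mechanism DP mean estimation with L2 clipping, which adds noise to all d coordinates regardless of gradient sparsity, can be sketched as follows (the function name, parameters, and noise calibration below are an illustrative textbook construction, not the paper's method):

```python
import numpy as np

def dp_mean(data, clip_norm, epsilon, delta, rng=None):
    """Approximate-DP mean estimate via the standard Gaussian mechanism.

    `data` is an (n, d) array of per-user vectors (possibly sparse, i.e.
    mostly zero coordinates). Each vector is clipped to L2 norm `clip_norm`,
    the clipped vectors are averaged, and isotropic Gaussian noise
    calibrated to the mean's L2 sensitivity (clip_norm / n) is added.
    Note the noise is dense: every coordinate is perturbed, which is why
    the error of this baseline grows with the ambient dimension d.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = data.shape
    norms = np.linalg.norm(data, axis=1, keepdims=True)
    scale = np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    clipped = data * scale
    sensitivity = clip_norm / n  # L2 sensitivity of the clipped mean
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return clipped.mean(axis=0) + rng.normal(0.0, sigma, size=d)
```

Because the noise magnitude scales with sqrt(d), this baseline degrades in the high-dimensional regime even when individual vectors are sparse; exploiting that sparsity to get nearly dimension-independent rates is the gap the paper addresses.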