Efficient Empirical Risk Minimization with Smooth Loss Functions in Non-interactive Local Differential Privacy

Wang, Di, Gaboardi, Marco, Xu, Jinhui

arXiv.org Machine Learning 

In this paper, we study the Empirical Risk Minimization (ERM) problem in the non-interactive local model of differential privacy. We first show that if the ERM loss function is $(\infty, T)$-smooth, then we can avoid a dependence of the sample complexity (to achieve error $\alpha$) on the exponential of the dimensionality $p$ with base $1/\alpha$ ({\em i.e.,} $\alpha^{-p}$), which answers a question posed in \cite{smith2017interaction}. Our approach is based on Bernstein polynomial approximation. Then, we propose player-efficient algorithms with $1$-bit communication complexity and $O(1)$ computation cost for each player; the error bound is asymptotically the same as the original one. Also, under additional assumptions, we show a server-efficient algorithm with polynomial running time. Finally, we propose (efficient) non-interactive locally differentially private algorithms, based on different types of polynomial approximations, for learning the set of $k$-way marginal queries and the set of smooth queries.
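For reference, the abstract's mention of Bernstein polynomial approximation refers to the standard construction: for a function $f$ on $[0,1]$, the degree-$k$ Bernstein polynomial is
$$B_k(f; x) = \sum_{\nu=0}^{k} f\!\left(\frac{\nu}{k}\right) \binom{k}{\nu} x^{\nu} (1-x)^{k-\nu},$$
which converges uniformly to $f$ as $k \to \infty$ when $f$ is continuous. The algorithms in the paper build on a multivariate extension of this approximation; the precise variant and its error guarantees are given in the paper itself.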
