Faster Differentially Private Convex Optimization via Second-Order Methods

Neural Information Processing Systems 

Differentially private (stochastic) gradient descent (DP-SGD) is the workhorse of differentially private machine learning in both the convex and non-convex settings.
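As a rough illustration of the DP-SGD mechanism the abstract refers to, the sketch below shows one update step: per-example gradients are clipped to a fixed norm bound and Gaussian noise calibrated to that bound is added before the averaged step is taken. This is a minimal NumPy sketch for a squared-loss model, not the paper's method; the function name `dp_sgd_step` and all parameter choices are illustrative assumptions.

```python
import numpy as np

def dp_sgd_step(w, X, y, lr=0.1, clip=1.0, noise_mult=1.0, rng=None):
    """One illustrative DP-SGD step (hypothetical helper, not from the paper).

    Per-example gradients are clipped to L2 norm `clip`, summed, perturbed
    with Gaussian noise of scale noise_mult * clip, then averaged.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    n = len(y)
    # Per-example gradients of the squared loss 0.5 * (x.w - y)^2
    residuals = X @ w - y
    grads = residuals[:, None] * X                      # shape (n, d)
    norms = np.linalg.norm(grads, axis=1, keepdims=True)
    grads = grads / np.maximum(1.0, norms / clip)       # clip each gradient
    noise = rng.normal(0.0, noise_mult * clip, size=w.shape)
    return w - lr * (grads.sum(axis=0) + noise) / n
```

Because the noise scale is tied to the clipping bound, the privacy cost of each step can be accounted for with standard Gaussian-mechanism analysis; the clipping is what makes each example's influence on the update bounded.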
