Adapting to Function Difficulty and Growth Conditions in Private Optimization

Hilal Asi, Daniel Levy

Neural Information Processing Systems 

We develop algorithms for private stochastic convex optimization that adapt to the hardness of the specific function we wish to optimize. While previous work provides worst-case bounds for arbitrary convex functions, the function at hand often belongs to a smaller class that enjoys faster rates. Concretely, we show that for functions exhibiting κ-growth around the optimum, i.e., f(x) − f(x⋆) ≥ (λ/κ)‖x − x⋆‖₂^κ for all x,
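As a minimal sketch of the κ-growth condition stated above (the helper `kappa_growth_holds` and the test function are illustrative assumptions, not from the paper): a function satisfies κ-growth with parameter λ around its minimizer x⋆ when f(x) − f(x⋆) ≥ (λ/κ)‖x − x⋆‖₂^κ. The snippet checks this numerically for f(x) = ‖x‖², which has 2-growth with λ = 2 around x⋆ = 0.

```python
import numpy as np

def kappa_growth_holds(f, x_star, kappa, lam, points, tol=1e-12):
    """Numerically check f(x) - f(x*) >= (lam/kappa) * ||x - x*||_2^kappa
    on a finite set of test points (a necessary-condition spot check only)."""
    f_star = f(x_star)
    return all(
        f(x) - f_star >= (lam / kappa) * np.linalg.norm(x - x_star) ** kappa - tol
        for x in points
    )

# f(x) = ||x||^2 satisfies 2-growth with lam = 2 around x* = 0,
# since f(x) - f(0) = ||x||^2 = (2/2) * ||x||_2^2.
f = lambda x: float(np.dot(x, x))
test_points = [np.array([t]) for t in np.linspace(-1.0, 1.0, 9)]
print(kappa_growth_holds(f, np.array([0.0]), kappa=2, lam=2.0, points=test_points))
```

The check is only a finite-sample spot check; establishing growth rigorously requires a proof over the whole domain, as in the paper's analysis.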