Faster Differentially Private Samplers via Rényi Divergence Analysis of Discretized Langevin MCMC

Neural Information Processing Systems 

Various differentially private algorithms instantiate the exponential mechanism, and require sampling from a distribution proportional to \exp(-f) for a suitable function f. When the domain of the distribution is high-dimensional, this sampling can be challenging. Using heuristic sampling schemes such as Gibbs sampling does not necessarily lead to provable privacy. When f is convex, techniques from log-concave sampling lead to polynomial-time algorithms, albeit with large polynomial running times. Langevin dynamics-based algorithms offer much faster alternatives under some distance measures such as statistical distance.
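As a rough illustration of the Langevin dynamics-based approach mentioned above (a generic sketch, not the paper's specific algorithm), the unadjusted Langevin algorithm (ULA) discretizes the Langevin diffusion to approximately sample from a density proportional to \exp(-f) using only gradient access to f. The function and parameter names below are illustrative choices, not taken from the paper:

```python
import numpy as np

def ula_sample(grad_f, x0, step=0.01, n_steps=2000, rng=None):
    """Unadjusted Langevin algorithm: iterate
        x <- x - step * grad_f(x) + sqrt(2 * step) * N(0, I),
    which approximately samples from the density prop. to exp(-f).
    The discretization step introduces a bias that shrinks with `step`.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x - step * grad_f(x) + np.sqrt(2.0 * step) * noise
    return x

# Example: f(x) = ||x||^2 / 2, so exp(-f) is the standard Gaussian
# and grad_f(x) = x. Each chain yields one approximate sample.
rng = np.random.default_rng(42)
samples = np.array([ula_sample(lambda x: x, np.zeros(2), rng=rng)
                    for _ in range(300)])
```

For a standard Gaussian target the empirical mean and standard deviation of the samples should be close to 0 and 1, up to Monte Carlo error and the small discretization bias of ULA.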