
Collaborating Authors

 Gautam Kamath


Private Hypothesis Selection

Neural Information Processing Systems

We provide a differentially private algorithm for hypothesis selection. Given samples from an unknown probability distribution P and a set of m probability distributions H, the goal is to output, in an ε-differentially private manner, a distribution from H whose total variation distance to P is comparable to that of the best such distribution (which we denote by α).
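A standard building block for private selection from a finite candidate set is the exponential mechanism, which picks a candidate with probability exponentially weighted by a utility score. The sketch below is illustrative only, not the paper's algorithm: the candidate scores (here, hypothetical negated empirical distance estimates) and their sensitivity of 1 are assumptions for the example.

```python
import numpy as np

def exponential_mechanism(scores, epsilon, sensitivity=1.0, rng=None):
    """Select one index privately: candidate i is chosen with probability
    proportional to exp(epsilon * scores[i] / (2 * sensitivity)).

    `scores` is assumed to change by at most `sensitivity` when one
    sample of the underlying dataset changes (an assumption for this
    sketch, not something verified here)."""
    rng = rng or np.random.default_rng()
    scores = np.asarray(scores, dtype=float)
    logits = epsilon * scores / (2.0 * sensitivity)
    logits -= logits.max()          # subtract max for numerical stability
    probs = np.exp(logits)
    probs /= probs.sum()
    return rng.choice(len(scores), p=probs)
```

With a small privacy budget the output is noisy across candidates; as epsilon grows, the mechanism concentrates on the highest-scoring hypothesis.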


Differentially Private Algorithms for Learning Mixtures of Separated Gaussians

Neural Information Processing Systems

Learning the parameters of Gaussian mixture models is a fundamental and widely studied problem with numerous applications. In this work, we give new algorithms for learning the parameters of a high-dimensional, well-separated Gaussian mixture model subject to the strong constraint of differential privacy. In particular, we give a differentially private analogue of the algorithm of Achlioptas and McSherry (COLT 2005). Our algorithm has two key properties not achieved by prior work: (1) The algorithm's sample complexity matches that of the corresponding non-private algorithm up to lower order terms in a wide range of parameters.
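To give a feel for how differential privacy enters parameter estimation, here is a minimal sketch of the classic Laplace mechanism applied to a one-dimensional mean. It assumes known data bounds [lo, hi] and is purely illustrative; the paper's actual algorithm handles high-dimensional mixtures with unknown parameters, which is far more involved.

```python
import numpy as np

def private_mean(samples, epsilon, lo, hi, rng=None):
    """Epsilon-DP estimate of the mean of bounded data (Laplace mechanism).

    Each sample is clamped to [lo, hi], so changing one sample moves the
    empirical mean by at most (hi - lo) / n; adding Laplace noise with
    scale sensitivity / epsilon then yields epsilon-differential privacy."""
    rng = rng or np.random.default_rng()
    x = np.clip(np.asarray(samples, dtype=float), lo, hi)
    sensitivity = (hi - lo) / len(x)
    return x.mean() + rng.laplace(scale=sensitivity / epsilon)
```

The cost of privacy appears as the noise scale (hi - lo) / (n * epsilon), which shrinks as the sample size grows; matching the non-private sample complexity up to lower-order terms, as the abstract claims, is exactly about making such added error negligible.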

