
Collaborating Authors

 Travis Dick


Data-Driven Clustering via Parameterized Lloyd's Families

Neural Information Processing Systems

Clustering points in metric spaces is a long-studied area of research. Clustering has seen a multitude of work, both theoretical, in understanding the approximation guarantees achievable for objective functions such as k-median and k-means, and experimental, in finding the fastest algorithms and seeding procedures for Lloyd's algorithm. The performance of a given clustering algorithm depends on the specific application at hand, which may not be known up front. For example, a "typical instance" may vary depending on the application, and different clustering heuristics perform differently depending on the instance. In this paper, we define an infinite family of algorithms generalizing Lloyd's algorithm, with one parameter controlling the initialization procedure and another parameter controlling the local search procedure. This family of algorithms includes the celebrated k-means++ algorithm, as well as the classic farthest-first traversal algorithm. We design efficient learning algorithms which receive samples from an application-specific distribution over clustering instances and learn a near-optimal clustering algorithm from the class. We show the best parameters vary significantly across datasets such as MNIST, CIFAR, and mixtures of Gaussians. Our learned algorithms never perform worse than k-means++, and on some datasets we see significant improvements.
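The abstract does not spell out the parameterization, but one seeding rule consistent with it, containing uniform seeding, k-means++, and farthest-first traversal as special cases, samples each new center with probability proportional to d(x, C)^alpha, the distance from a point to its nearest already-chosen center raised to a power. Below is a minimal Python sketch of that rule; the function name, the parameter name alpha, and the d^alpha form are illustrative assumptions, not a statement of the paper's exact family.

```python
import numpy as np

def d_alpha_seeding(X, k, alpha, seed=None):
    """Pick k initial centers from the rows of X, sampling each new center
    with probability proportional to d(x, C)^alpha, where d(x, C) is the
    distance from x to its nearest already-chosen center.

    alpha = 0 gives uniform random seeding, alpha = 2 gives the k-means++
    distribution, and alpha -> infinity approaches farthest-first traversal.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    centers = [X[rng.integers(n)]]          # first center: uniform at random
    dist = np.full(n, np.inf)
    for _ in range(k - 1):
        # distance from each point to its nearest chosen center so far
        dist = np.minimum(dist, np.linalg.norm(X - centers[-1], axis=1))
        weights = dist ** alpha             # assumes not all points coincide with centers
        centers.append(X[rng.choice(n, p=weights / weights.sum())])
    return np.array(centers)

# Example: k-means++-style seeding (alpha = 2) on synthetic data
X = np.random.default_rng(0).standard_normal((500, 2))
centers = d_alpha_seeding(X, k=5, alpha=2.0, seed=1)
```

Lloyd's local search would then alternate assignment and center-recomputation steps from these seeds; the paper's second parameter varies that local search step.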



Differentially Private Covariance Estimation

Neural Information Processing Systems

The task of privately estimating a covariance matrix is a popular one due to its applications to regression and PCA. While there are known methods for releasing private covariance matrices, these algorithms either achieve only (ε, δ)-differential privacy or require very complicated sampling schemes, and ultimately perform poorly on real data. In this work we propose a new ε-differentially private algorithm for computing the covariance matrix of a dataset that addresses both of these limitations. We show that it has lower error than existing state-of-the-art approaches, both analytically and empirically. In addition, the algorithm is significantly less complicated than other methods and can be efficiently implemented with rejection sampling.
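The abstract describes the method only at a high level. As a hypothetical illustration of how pure ε-differential privacy and rejection sampling can combine for covariance estimation, the sketch below picks eigenvector directions one at a time via the exponential mechanism (sampled by rejection), adds Laplace noise to each eigenvalue, and deflates. The row-norm bound of 1, the even budget split across the 2d sub-mechanisms, and the uniform-sphere proposal are all illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def sample_direction(B, eps, rng):
    """Rejection-sample a unit vector v with density proportional to
    exp((eps / 2) * v^T B v): the exponential mechanism with quality
    v^T B v, whose sensitivity is at most 1 when data rows have norm <= 1.

    Proposal: uniform on the sphere. Any upper bound on max_v v^T B v
    makes the sampler exact; we use the top eigenvalue of B.
    """
    r = B.shape[0]
    bound = np.linalg.eigvalsh(B)[-1]       # eigvalsh returns ascending order
    while True:
        v = rng.standard_normal(r)
        v /= np.linalg.norm(v)              # uniform direction on the sphere
        if rng.random() < np.exp(0.5 * eps * (v @ B @ v - bound)):
            return v

def private_covariance(X, eps, seed=None):
    """Sketch of eps-DP covariance estimation: privately choose eigenvector
    directions one at a time, estimate each eigenvalue with Laplace noise,
    and deflate. Assumes every row of X has Euclidean norm at most 1."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    A = X.T @ X
    eps0 = eps / (2 * d)                    # naive split: d directions + d eigenvalues
    C_hat = np.zeros((d, d))
    V = np.eye(d)                           # orthonormal basis of the remaining subspace
    for _ in range(d):
        B = V.T @ A @ V                     # A restricted to the remaining subspace
        v = sample_direction(B, eps0, rng)
        u = V @ v                           # chosen direction in the original space
        lam = u @ A @ u + rng.laplace(scale=1.0 / eps0)   # sensitivity-1 query
        C_hat += max(lam, 0.0) * np.outer(u, u)
        # deflate: rows 1.. of Vt span the orthogonal complement of v
        _, _, Vt = np.linalg.svd(v[None, :], full_matrices=True)
        V = V @ Vt[1:].T
    return C_hat
```

A production implementation would also need a sharper proposal distribution and care around the observable number of rejections; the efficiency claim in the abstract refers to the paper's own sampler, not this naive one.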

