Instance-Optimal Private Density Estimation in the Wasserstein Distance

Neural Information Processing Systems

Estimating the density of a distribution from samples is a fundamental problem in statistics. In many practical settings, the Wasserstein distance is an appropriate error metric for density estimation. For example, when estimating population densities in a geographic region, a small Wasserstein distance means that the estimate is able to capture roughly where the population mass is. In this work we study differentially private density estimation in the Wasserstein distance. We design and analyze instance-optimal algorithms for this problem that can adapt to easy instances.
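As a concrete illustration of the error metric (not part of the paper itself), in one dimension the Wasserstein-1 distance between two equal-size empirical samples reduces to the mean absolute difference of their order statistics. A minimal sketch, with the function name `wasserstein_1d` chosen here for illustration:

```python
import numpy as np

def wasserstein_1d(x, y):
    """Wasserstein-1 distance between two equal-size 1-D empirical samples.

    For sorted samples of equal size n, W1 equals the mean absolute
    difference of the paired order statistics.
    """
    x = np.sort(np.asarray(x, dtype=float))
    y = np.sort(np.asarray(y, dtype=float))
    assert x.shape == y.shape, "this shortcut assumes equal sample sizes"
    return float(np.mean(np.abs(x - y)))

# Shifting every point by a constant c gives W1 exactly c:
# wasserstein_1d([0, 1, 2], [1, 2, 3]) evaluates to 1.0
```

This captures the intuition stated above: a small value means the estimated mass sits roughly where the true mass is, even if individual points do not match exactly.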



Outline of Supplementary Materials

Neural Information Processing Systems

Proofs of results stated in the main text are provided in Appendix A. Additional experimental results, including coverage plots, are provided in Appendix B. Appendix B also details the experimental setup used in the paper; Figure 1 shows the fitted mean and covariance on a single draw of the quadratic dataset, and the visualized covariance matrices were projected to ensure positive semi-definiteness. A further section presents the multivariate algorithm for finite-difference IDM (FDIDM).
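The outline does not say how the covariance matrices were projected to be positive semi-definite; a standard choice, sketched here purely as an assumption, is to symmetrize and clip negative eigenvalues at zero:

```python
import numpy as np

def project_psd(a):
    """Project a (nearly) symmetric matrix onto the PSD cone.

    Symmetrizes the input, then clips negative eigenvalues to zero,
    which is the Frobenius-norm-nearest PSD matrix to the symmetrized input.
    """
    a = np.asarray(a, dtype=float)
    sym = (a + a.T) / 2.0                 # enforce exact symmetry first
    vals, vecs = np.linalg.eigh(sym)      # eigendecomposition of symmetric matrix
    vals = np.clip(vals, 0.0, None)       # drop negative eigenvalues
    return vecs @ np.diag(vals) @ vecs.T

# Example: [[1, 2], [2, 1]] has eigenvalues 3 and -1; clipping the
# negative one yields [[1.5, 1.5], [1.5, 1.5]].
```

Whether the paper uses this exact projection or a different repair (e.g. adding a small ridge to the diagonal) is not stated in the outline.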