elliptical distribution

Energy-Tweedie: Score meets Score, Energy meets Energy

Leban, Andrej

arXiv.org Machine Learning

Denoising and score estimation have long been known to be linked via the classical Tweedie's formula. In this work, we first extend the latter to a wider family of distributions, often called "energy models" and referred to here as elliptical distributions. Next, we examine an alternative view: we consider the denoising posterior $P(X|Y)$ as the optimizer of the energy score (a scoring rule) and derive a fundamental identity connecting the (path-)derivative of a (possibly) non-Euclidean energy score to the score of the noisy marginal. This identity can be seen as an analog of Tweedie's identity for the energy score, and it enables several applications: score estimation, noise-distribution parameter estimation, and the use of energy score models in "traditional" diffusion model samplers with a wider array of noising distributions.
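As background for the abstract above, the classical Gaussian Tweedie's formula, $E[X \mid Y=y] = y + \sigma^2 \nabla_y \log p(y)$, can be checked numerically in a simple conjugate setup. This is a minimal sketch of the classical identity only, not of the paper's elliptical or energy-score extensions; the setup (Gaussian prior and noise levels) is chosen for illustration:

```python
import numpy as np
from scipy.stats import norm

# Conjugate Gaussian setup: X ~ N(0, tau^2), Y = X + N(0, sigma^2),
# so the noisy marginal is Y ~ N(0, tau^2 + sigma^2).
tau, sigma = 2.0, 1.0
marginal_sd = np.sqrt(tau**2 + sigma**2)

def score_marginal(y, eps=1e-5):
    """Finite-difference score d/dy log p(y) of the noisy marginal."""
    return (norm.logpdf(y + eps, scale=marginal_sd)
            - norm.logpdf(y - eps, scale=marginal_sd)) / (2 * eps)

y = 1.5
# Tweedie's formula: E[X | Y=y] = y + sigma^2 * score(y)
tweedie = y + sigma**2 * score_marginal(y)
# Closed-form posterior mean in the conjugate Gaussian case:
posterior_mean = tau**2 / (tau**2 + sigma**2) * y
print(tweedie, posterior_mean)  # agree up to finite-difference error
```

Both quantities reduce to the same shrinkage estimate $\tau^2/(\tau^2+\sigma^2)\, y$, which is the content of Tweedie's identity in this special case.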


Elliptical Perturbations for Differential Privacy

Neural Information Processing Systems

We study elliptical distributions in locally convex vector spaces, and determine conditions when they can or cannot be used to satisfy differential privacy (DP). A requisite condition for a sanitized statistical summary to satisfy DP is that the corresponding privacy mechanism must induce equivalent probability measures for all possible input databases. We show that elliptical distributions with the same dispersion operator, $C$, are equivalent if the difference of their means lies in the Cameron-Martin space of $C$. In the case of releasing finite-dimensional summaries using elliptical perturbations, we show that the privacy parameter $\epsilon$ can be computed in terms of a one-dimensional maximization problem. We apply this result to consider multivariate Laplace, $t$, Gaussian, and $K$-norm noise. Surprisingly, we show that the multivariate Laplace noise does not achieve $\epsilon$-DP in any dimension greater than one. Finally, we show that when the dimension of the space is infinite, no elliptical distribution can be used to give $\epsilon$-DP; only $(\epsilon,\delta)$-DP is possible.
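The quantity behind the privacy parameter $\epsilon$ is the supremum of the log-density ratio of the mechanism's output under neighboring inputs. As standard DP background (a scalar illustration only, not the paper's multivariate or infinite-dimensional computation, where Laplace noise is shown to fail), one can verify that one-dimensional Laplace noise with scale $b$ yields exactly $\epsilon = \Delta/b$ for a summary with sensitivity $\Delta$:

```python
import numpy as np
from scipy.stats import laplace

# Scalar Laplace mechanism: release f(D) + Laplace(b).  For two
# databases whose summaries differ by delta_f, the privacy loss
# sup_y |log p1(y) - log p2(y)| equals delta_f / b.
delta_f = 1.0   # sensitivity of the summary (illustrative value)
b = 0.5         # Laplace scale; target epsilon = delta_f / b = 2.0

ys = np.linspace(-10.0, 10.0, 200001)
log_ratio = (laplace.logpdf(ys, loc=0.0, scale=b)
             - laplace.logpdf(ys, loc=delta_f, scale=b))
eps_numeric = np.max(np.abs(log_ratio))
print(eps_numeric, delta_f / b)  # both 2.0
```

The grid maximization here is a crude stand-in for the one-dimensional maximization problem mentioned in the abstract; the paper's result concerns general elliptical perturbations rather than this textbook case.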


Generalizing Point Embeddings using the Wasserstein Space of Elliptical Distributions

Neural Information Processing Systems

Embedding complex objects as vectors in low-dimensional spaces is a longstanding problem in machine learning. We propose in this work an extension of that approach, which consists of embedding objects as elliptical probability distributions, namely distributions whose densities have elliptical level sets. We endow these measures with the 2-Wasserstein metric, with two important benefits: (i) For such measures, the squared 2-Wasserstein metric has a closed form, equal to a weighted sum of the squared Euclidean distance between means and the squared Bures metric between covariance matrices. The latter is a Riemannian metric between positive semi-definite matrices which turns out to be Euclidean on a suitable factor representation of such matrices, valid on the entire geodesic between them.
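The closed form cited in the abstract can be sketched in the Gaussian special case, where the squared 2-Wasserstein distance is the squared Euclidean distance between means plus the squared Bures metric between covariances, $\mathrm{tr}\big(\Sigma_1 + \Sigma_2 - 2(\Sigma_1^{1/2}\Sigma_2\Sigma_1^{1/2})^{1/2}\big)$. A minimal sketch (function name and test matrices are mine):

```python
import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein_sq(m1, S1, m2, S2):
    """Squared 2-Wasserstein distance between Gaussians N(m1, S1) and
    N(m2, S2): ||m1 - m2||^2 + Bures^2(S1, S2)."""
    rS1 = sqrtm(S1)
    cross = sqrtm(rS1 @ S2 @ rS1)
    bures_sq = np.trace(S1 + S2 - 2 * cross).real
    return float(np.sum((m1 - m2) ** 2) + bures_sq)

m1, m2 = np.zeros(2), np.array([3.0, 4.0])
S1, S2 = np.eye(2), 4 * np.eye(2)
# Commuting covariances: Bures^2 = ||S1^{1/2} - S2^{1/2}||_F^2 = 2,
# mean term = 3^2 + 4^2 = 25, so the total is 27.
print(bures_wasserstein_sq(m1, S1, m2, S2))  # 27.0
```

When the covariances commute (as in this example) the Bures term reduces to the Frobenius distance between matrix square roots, which is the Euclidean factor representation alluded to in the abstract.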