Two-Sided Bounds for Entropic Optimal Transport via a Rate-Distortion Integral

Liu, Jingbo

arXiv.org Machine Learning

We show that the maximum expected inner product between a random vector and the standard normal vector, over all couplings subject to a mutual information constraint or regularization, is equivalent to a truncated integral involving the rate-distortion function, up to universal multiplicative constants. The proof is based on a lifting technique: we construct a Gaussian process indexed by a random subset of the type class of the probability distribution involved in the information-theoretic inequality, and then apply a form of the majorizing measure theorem.










9a6b278218966499194491f55ccf8b75-Supplemental-Conference.pdf

Neural Information Processing Systems

The unit ℓ2-sphere in d dimensions centered at the origin is denoted by S^{d−1}. Additionally, given a pair of symmetric matrices A, B ∈ ℝ^{d×d}, we write A ⪰ B if and only if x⊤(A − B)x ≥ 0 for all x ∈ ℝ^d. More linear algebra facts appear in Appendix E. Let V ⊂ P be a subset of distributions indexed by the points in the hypercube E_d = {−1, 1}^d. For a number of facts from probability and statistics (both related and unrelated to exponential families), we refer the reader to Appendix F.
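The matrix ordering above is the Loewner (positive-semidefinite) order: A ⪰ B exactly when A − B has no negative eigenvalues. A minimal numerical sketch of that check (the function name `loewner_geq` and the tolerance are illustrative choices, not from the paper):

```python
import numpy as np

def loewner_geq(A, B, tol=1e-12):
    """Return True iff A >= B in the Loewner order, i.e. iff
    x^T (A - B) x >= 0 for all x, i.e. iff A - B is positive
    semidefinite (all eigenvalues nonnegative up to a tolerance)."""
    D = A - B
    D = (D + D.T) / 2  # symmetrize to guard against floating-point asymmetry
    return bool(np.linalg.eigvalsh(D).min() >= -tol)

A = np.diag([2.0, 3.0])
B = np.eye(2)
print(loewner_geq(A, B))  # True: A - B = diag(1, 2) is PSD
print(loewner_geq(B, A))  # False: B - A has negative eigenvalues
```

Checking the minimum eigenvalue of the symmetrized difference is equivalent to the quadratic-form definition, since x⊤Dx ≥ 0 for all x iff λ_min(D) ≥ 0.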