On the Properties of Kullback-Leibler Divergence Between Multivariate Gaussian Distributions
Neural Information Processing Systems
Kullback-Leibler (KL) divergence is one of the most widely used measures of the difference between probability distributions. In this paper, we theoretically study several properties of the KL divergence between multivariate Gaussian distributions.
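For multivariate Gaussians, the KL divergence has a well-known closed form: for $\mathcal{N}(\mu_0,\Sigma_0)$ and $\mathcal{N}(\mu_1,\Sigma_1)$ in $k$ dimensions, $D_{\mathrm{KL}} = \tfrac{1}{2}\big[\operatorname{tr}(\Sigma_1^{-1}\Sigma_0) + (\mu_1-\mu_0)^\top \Sigma_1^{-1}(\mu_1-\mu_0) - k + \ln\tfrac{\det\Sigma_1}{\det\Sigma_0}\big]$. A minimal NumPy sketch of this formula (not code from the paper):

```python
import numpy as np

def gaussian_kl(mu0, Sigma0, mu1, Sigma1):
    """Closed-form KL( N(mu0, Sigma0) || N(mu1, Sigma1) )."""
    k = mu0.shape[0]
    Sigma1_inv = np.linalg.inv(Sigma1)
    diff = mu1 - mu0
    return 0.5 * (
        np.trace(Sigma1_inv @ Sigma0)          # tr(Sigma1^{-1} Sigma0)
        + diff @ Sigma1_inv @ diff             # Mahalanobis term
        - k                                    # dimension
        + np.log(np.linalg.det(Sigma1) / np.linalg.det(Sigma0))  # log-det ratio
    )

# The divergence is zero for identical distributions and asymmetric in general:
mu0, S0 = np.zeros(2), np.eye(2)
mu1, S1 = np.array([1.0, 0.0]), 2.0 * np.eye(2)
print(gaussian_kl(mu0, S0, mu0, S0))  # 0.0
print(gaussian_kl(mu0, S0, mu1, S1), gaussian_kl(mu1, S1, mu0, S0))
```

Note that the KL divergence is not a metric: it is nonnegative but asymmetric, which the example above illustrates.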
May 25, 2025