On the Properties of Kullback-Leibler Divergence Between Multivariate Gaussian Distributions

Neural Information Processing Systems 

Kullback-Leibler (KL) divergence is one of the most important measures of the difference between two probability distributions. In this paper, we theoretically study several properties of the KL divergence between multivariate Gaussian distributions.
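For context, the abstract does not state the divergence itself; the closed form below is the standard result for Gaussians on \mathbb{R}^k, not a quotation from the paper. For \mathcal{N}(\mu_1, \Sigma_1) and \mathcal{N}(\mu_2, \Sigma_2),

D_{\mathrm{KL}}\bigl(\mathcal{N}(\mu_1,\Sigma_1)\,\|\,\mathcal{N}(\mu_2,\Sigma_2)\bigr) = \frac{1}{2}\Bigl[\operatorname{tr}\bigl(\Sigma_2^{-1}\Sigma_1\bigr) + (\mu_2-\mu_1)^{\top}\Sigma_2^{-1}(\mu_2-\mu_1) - k + \ln\frac{\det\Sigma_2}{\det\Sigma_1}\Bigr].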
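A minimal numerical sketch of this closed form, assuming NumPy; the helper name kl_mvn and the test values are illustrative and do not come from the paper:

import numpy as np

def kl_mvn(mu1, sigma1, mu2, sigma2):
    # D_KL(N(mu1, sigma1) || N(mu2, sigma2)) via the closed form above.
    k = mu1.shape[0]
    sigma2_inv = np.linalg.inv(sigma2)
    diff = mu2 - mu1
    # slogdet is used instead of det to avoid overflow/underflow
    # in higher dimensions; covariances are assumed positive definite.
    _, logdet1 = np.linalg.slogdet(sigma1)
    _, logdet2 = np.linalg.slogdet(sigma2)
    return 0.5 * (np.trace(sigma2_inv @ sigma1)
                  + diff @ sigma2_inv @ diff
                  - k
                  + logdet2 - logdet1)

# Sanity checks: KL of a distribution with itself is 0, and KL is nonnegative.
mu = np.array([0.0, 1.0])
sigma = np.array([[2.0, 0.3], [0.3, 1.0]])
assert abs(kl_mvn(mu, sigma, mu, sigma)) < 1e-12
print(kl_mvn(mu, sigma, np.zeros(2), np.eye(2)))  # ~0.676, >= 0 as expected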
