Profile Entropy: A Fundamental Measure for the Learnability and Compressibility of Distributions

Neural Information Processing Systems 

The profile of a sample is the multiset of its symbol frequencies. We show that for samples of discrete distributions, profile entropy is a fundamental measure unifying the concepts of estimation, inference, and compression.
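The definition above can be illustrated concretely. This is a minimal sketch of computing a sample's profile; the function name `profile` and the choice to return the multiset as a sorted list are ours, not from the paper.

```python
from collections import Counter

def profile(sample):
    """Return the profile of a sample: the multiset of its symbol frequencies.

    The sample "abracadabra" has symbol counts a:5, b:2, r:2, c:1, d:1,
    so its profile is the multiset {5, 2, 2, 1, 1} (sorted here for clarity).
    """
    counts = Counter(sample)  # frequency of each distinct symbol
    return sorted(counts.values(), reverse=True)  # multiset of frequencies

print(profile("abracadabra"))  # [5, 2, 2, 1, 1]
```

Note that the profile discards symbol identities: "abracadabra" and any relabeling of its symbols share the same profile, which is what makes it a natural statistic for symmetric properties of distributions.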
