An Information-Theoretic Framework for Out-of-Distribution Generalization

Wenliang Liu, Guanding Yu, Lele Wang, Renjie Liao

arXiv.org (Artificial Intelligence)

Besides recovering known results, the general framework also derives new generalization bounds. When evaluated in concrete examples, the new bounds can strictly outperform existing OOD generalization bounds in some cases and recover the tightest existing bounds in other cases. Finally, it is worth mentioning that these generalization bounds also apply to the in-distribution generalization case, by simply setting the test distribution equal to the training distribution.

Improving the generalization ability is the core objective of supervised learning. In the past decades, a series of mathematical tools have been invented or applied to bound the generalization gap, such as the VC dimension [1], Rademacher complexity [2], covering numbers [3], algorithmic stability [4], and PAC Bayes [5]. Recently, there have been attempts to bound the generalization gap using information-theoretic tools.
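To make the information-theoretic approach concrete, the sketch below restates two standard results from the literature rather than this paper's own bounds: the mutual-information generalization bound of Xu and Raginsky (2017), and a Donsker-Varadhan change-of-measure inequality that illustrates how a distribution-shift penalty can enter an OOD bound. The notation (hypothesis W, n-sample S, training distribution \mu, test distribution \mu', sub-Gaussian parameter \sigma) is a convention assumed for this sketch, not taken from the paper.

% Mutual-information bound (Xu & Raginsky, 2017): if the loss \ell(w, Z)
% is \sigma-sub-Gaussian for Z ~ \mu and every w, then for a learning
% algorithm that outputs W given the n-sample S,
\[
  \bigl|\mathbb{E}\!\left[L_\mu(W) - L_S(W)\right]\bigr|
  \;\le\; \sqrt{\frac{2\sigma^2}{n}\, I(W; S)},
\]
% where L_S(W) is the empirical risk on S and L_\mu(W) the population risk.

% Change of measure (Donsker--Varadhan): under the same sub-Gaussian
% assumption, shifting from training distribution \mu to test
% distribution \mu' costs at most a KL-divergence term,
\[
  \bigl|\mathbb{E}_{Z \sim \mu'}[\ell(w, Z)] - \mathbb{E}_{Z \sim \mu}[\ell(w, Z)]\bigr|
  \;\le\; \sqrt{2\sigma^2\, D(\mu' \,\|\, \mu)}.
\]
% Setting \mu' = \mu makes the shift term vanish, which mirrors the
% remark above that OOD bounds specialize to the in-distribution case.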
