Supplementary Material for "PAC-Bayes Compression Bounds So Tight That They Can Explain Generalization"

Appendix Outline

Neural Information Processing Systems 

The appendix is organized as follows. In Appendix A, we report results for additional bounds on SVHN and ImageNet. We also report the compressed size corresponding to our best bound values and compare it to the compressed size obtained through standard pruning. Furthermore, in Appendix A.1 we prove that models cannot both be compressible and fit random labels. In Appendix B, we describe how optimization over hyperparameters such as the intrinsic dimension impacts the PAC-Bayes bound. In Appendix C, we show how our PAC-Bayes bounds benefit from transfer learning.
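To make the connection between compressed size and the bound concrete, the following is a minimal sketch of how a compression-based PAC-Bayes bound can be evaluated. It uses a standard McAllester-style inequality and the common trick of treating the KL term as the compressed description length of the model in nats; the function name, the specific bound form, and the example numbers are illustrative assumptions, not the paper's exact bound.

```python
import math

def pac_bayes_bound(train_err, compressed_bits, n, delta=0.05):
    """Illustrative McAllester-style PAC-Bayes bound (not the paper's exact form).

    When the prior assigns mass 2^{-|h|} to a model with compressed
    description length |h| bits, KL(Q || P) is at most |h| * ln(2) nats,
    so a smaller compressed model yields a tighter bound.
    """
    kl = compressed_bits * math.log(2)  # description length in nats
    complexity = (kl + math.log(2 * math.sqrt(n) / delta)) / (2 * n)
    return train_err + math.sqrt(complexity)

# Hypothetical numbers: 1% training error, 1 KB compressed model, 50k samples.
bound = pac_bayes_bound(train_err=0.01, compressed_bits=8 * 1024, n=50_000)
```

The key qualitative behavior is that the bound tightens as the compressed size shrinks, which is why aggressive compression can yield non-vacuous guarantees.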
