Evaluated CMI Bounds for Meta Learning: Tightness and Expressiveness
Neural Information Processing Systems
Recent work has established that the conditional mutual information (CMI) framework of Steinke and Zakynthinou (2020) is expressive enough to capture generalization guarantees in terms of algorithmic stability, VC dimension, and related complexity measures for conventional learning (Harutyunyan et al., 2021; Haghifam et al., 2021). Hence, it provides a unified method for establishing generalization bounds. In meta learning, there has so far been a divide between information-theoretic results and results from classical learning theory. In this work, we take a first step toward bridging this divide. Specifically, we present novel generalization bounds for meta learning in terms of the evaluated CMI (e-CMI).
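For background, the following is a minimal LaTeX sketch of the supersample CMI setup for conventional learning with losses bounded in [0,1]; the construction and the square-root bound are those of Steinke and Zakynthinou (2020), while the paper's meta-learning e-CMI bounds themselves are not reproduced here.

% Supersample construction: \tilde{Z} \in \mathcal{Z}^{n \times 2} holds 2n i.i.d.
% draws from the data distribution \mathcal{D}, and the selector
% U = (U_1, \ldots, U_n) \sim \mathrm{Unif}(\{0,1\}^n) picks one sample per row
% as the training set \tilde{Z}_U handed to the algorithm A.
\mathrm{CMI}_{\mathcal{D}}(A) = I\bigl( A(\tilde{Z}_U);\, U \mid \tilde{Z} \bigr),
\qquad
\bigl| \mathbb{E}[\mathrm{gen}(A)] \bigr| \le \sqrt{\frac{2\,\mathrm{CMI}_{\mathcal{D}}(A)}{n}}.
% The evaluated CMI (e-CMI) replaces the hypothesis A(\tilde{Z}_U) with the array
% of loss values \ell\bigl(A(\tilde{Z}_U), \tilde{Z}_{i,j}\bigr) on the supersample;
% given \tilde{Z} these losses are a function of the hypothesis, so by the
% data-processing inequality the e-CMI is never larger, yielding bounds at
% least as tight.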