Generalization Bounds For Meta-Learning: An Information-Theoretic Analysis
– Neural Information Processing Systems
We derive a novel information-theoretic analysis of the generalization properties of meta-learning algorithms. Compared with previous bounds that depend on the squared norms of gradients, empirical validation on both simulated data and a well-known few-shot benchmark shows that our bound is orders of magnitude tighter in most conditions.
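For context, the classical single-task information-theoretic generalization bound (in the style of Xu and Raginsky, 2017), which this line of work extends to the meta-learning setting, can be sketched as follows; the exact constants and the meta-learning extension in the paper may differ:

```latex
% For a learning algorithm producing hypothesis W from a sample
% S = (Z_1, ..., Z_n) of n i.i.d. draws, with a loss that is
% sigma-subgaussian under the data distribution, the expected
% generalization gap is controlled by the mutual information
% between the output hypothesis and the training sample:
\[
\left| \mathbb{E}\left[ L_\mu(W) - L_S(W) \right] \right|
\;\le\;
\sqrt{\frac{2\sigma^{2}}{n}\, I(W; S)},
\]
% where L_mu is the population risk, L_S the empirical risk,
% and I(W; S) the mutual information between W and S.
```

Bounds of this form depend on how much information the algorithm's output retains about its training data, rather than on gradient norms, which is what allows them to be much tighter in practice.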
Jan-19-2025, 09:02:32 GMT