Sample Compression Hypernetworks: From Generalization Bounds to Meta-Learning
Benjamin Leblanc, Mathieu Bazinet, Nathaniel D'Amours, Alexandre Drouin, Pascal Germain
Reconstruction functions are pivotal in sample compression theory, a framework for deriving tight generalization bounds. From a small subset of the training set (the compression set) and an optional stream of information (the message), they recover a predictor previously learned from the whole training set. While reconstruction functions are usually fixed, we propose to learn them. To facilitate the optimization and increase the expressiveness of the message, we derive a new sample compression generalization bound for real-valued messages. Building on this theoretical analysis, we present a new hypernetwork architecture that outputs predictors with tight generalization guarantees when trained within an original meta-learning framework. We then report the results of promising preliminary experiments.
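To make the abstract's central object concrete, here is a minimal sketch of a learned reconstruction function realized as a hypernetwork: it maps a compression set and a real-valued message to the weights of a simple linear predictor. This is an illustrative assumption, not the paper's architecture; all class names, dimensions, and the choice of a linear predictor are hypothetical.

```python
# Minimal sketch (assumed, not the paper's model): a hypernetwork mapping
# (compression set, real-valued message) -> parameters of a linear predictor.
import torch
import torch.nn as nn

class ReconstructionHypernetwork(nn.Module):
    """Learned reconstruction function for sample compression (illustrative)."""
    def __init__(self, n_compress, input_dim, msg_dim, hidden_dim=64):
        super().__init__()
        # Output: weight vector w (input_dim) and bias b (1) of a linear predictor.
        out_dim = input_dim + 1
        self.net = nn.Sequential(
            nn.Linear(n_compress * (input_dim + 1) + msg_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, out_dim),
        )

    def forward(self, xs, ys, msg):
        # xs:  (n_compress, input_dim) compression-set inputs
        # ys:  (n_compress,)           compression-set labels
        # msg: (msg_dim,)              real-valued message
        z = torch.cat([xs.flatten(), ys.float(), msg])
        params = self.net(z)
        w, b = params[:-1], params[-1]
        return w, b  # parameters of the reconstructed predictor

# Usage: reconstruct a predictor from 5 compressed examples, then classify a query.
hyper = ReconstructionHypernetwork(n_compress=5, input_dim=10, msg_dim=3)
xs, ys = torch.randn(5, 10), torch.randint(0, 2, (5,))
msg = torch.randn(3)
w, b = hyper(xs, ys, msg)
x_query = torch.randn(10)
prediction = torch.sign(x_query @ w + b)
```

In the meta-learning framing the abstract describes, the hypernetwork's own parameters would be trained across tasks, while each reconstructed predictor depends on the data only through the compression set and the message, which is what allows a sample compression bound to apply.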
arXiv.org Artificial Intelligence
Oct-17-2024