Sample Compression Hypernetworks: From Generalization Bounds to Meta-Learning
Leblanc, Benjamin, Bazinet, Mathieu, D'Amours, Nathaniel, Drouin, Alexandre, Germain, Pascal
arXiv.org Artificial Intelligence
Reconstruction functions are pivotal in sample compression theory, a framework for deriving tight generalization bounds. From a small subset of the training set (the compression set) and an optional stream of information (the message), they recover a predictor previously learned from the whole training set. While reconstruction functions are usually fixed, we propose to learn them. To facilitate the optimization and increase the expressiveness of the message, we derive a new sample compression generalization bound for real-valued messages. Building on this theoretical analysis, we present a new hypernetwork architecture that outputs predictors with tight generalization guarantees when trained using an original meta-learning framework. Promising preliminary experimental results are reported.
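The reconstruction setup the abstract describes can be illustrated with a minimal sketch. This toy code is an assumption for illustration only: the function name `reconstruct` and the message-weighted linear form are hypothetical (in the spirit of SVM-style sample compression, where support vectors play the role of the compression set), not the paper's learned hypernetwork architecture.

```python
import numpy as np

def reconstruct(compression_set, message):
    """Toy reconstruction function (hypothetical, for illustration):
    rebuilds a linear predictor as a message-weighted combination of
    the compression-set inputs, then classifies by the sign of the
    resulting linear score."""
    # Stack the stored inputs; labels are kept in the set but unused here.
    X = np.array([x for x, _ in compression_set], dtype=float)
    mu = np.asarray(message, dtype=float)  # real-valued message, one weight per stored example
    w = mu @ X                             # predictor weights as a linear combination
    return lambda x: float(np.sign(w @ np.asarray(x, dtype=float)))

# Usage: two stored examples and a two-component real-valued message.
comp_set = [((1.0, 0.0), +1), ((0.0, 1.0), -1)]
predict = reconstruct(comp_set, message=[1.0, -1.0])
```

The paper's contribution replaces such a fixed reconstruction rule with a learned one (a hypernetwork emitting predictor parameters), while the real-valued message is what the new generalization bound accommodates.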
Oct-17-2024