Non-Vacuous Generalization Bounds: Can Rescaling Invariances Help?
Rouchouse, Damien, Gonon, Antoine, Gribonval, Rémi, Guedj, Benjamin
A central challenge in understanding generalization is to obtain non-vacuous guarantees that go beyond worst-case complexity over data or weight space. Among existing approaches, PAC-Bayes bounds stand out as they can provide tight, data-dependent guarantees even for large networks. However, in ReLU networks, rescaling invariances mean that different weight distributions can represent the same function while leading to arbitrarily different PAC-Bayes complexities. We propose to study PAC-Bayes bounds in an invariant, lifted representation that resolves this discrepancy. This paper explores both the guarantees provided by this approach (invariance, tighter bounds via data processing) and the algorithmic aspects of KL-based rescaling-invariant PAC-Bayes bounds.
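The rescaling invariance the abstract refers to can be illustrated with a minimal sketch (not code from the paper): in a two-layer ReLU network, multiplying a hidden neuron's incoming weights by a > 0 and dividing its outgoing weights by a leaves the realized function unchanged, since ReLU(a·z) = a·ReLU(z) for a > 0, yet it changes the weights — and hence any weight-space complexity measure.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 3))   # input -> hidden weights
W2 = rng.standard_normal((2, 4))   # hidden -> output weights
x = rng.standard_normal(3)

def relu(z):
    return np.maximum(z, 0.0)

def forward(W1, W2, x):
    return W2 @ relu(W1 @ x)

# Rescale hidden neuron 0: incoming row times a, outgoing column divided by a.
# Positive homogeneity of ReLU makes this a function-preserving transformation.
a = 7.5
W1_s, W2_s = W1.copy(), W2.copy()
W1_s[0, :] *= a
W2_s[:, 0] /= a

out_original = forward(W1, W2, x)
out_rescaled = forward(W1_s, W2_s, x)
assert np.allclose(out_original, out_rescaled)  # same function, different weights
```

Because the rescaled weights can be made arbitrarily large or small while the network's function stays fixed, weight-distribution-based PAC-Bayes complexities can vary arbitrarily across equivalent parameterizations, which is the discrepancy the paper's invariant lifted representation is designed to remove.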
Oct-1-2025