Training Normalizing Flows with the Information Bottleneck for Competitive Generative Classification
–Neural Information Processing Systems
The Information Bottleneck (IB) objective uses information theory to formulate a trade-off between task performance and robustness. It has been successfully applied in the standard discriminative classification setting. We ask whether the IB can also be used to train generative likelihood models such as normalizing flows. Since normalizing flows use invertible neural network architectures (INNs), they are information-preserving by construction. This seems contradictory to the idea of a bottleneck.
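The apparent contradiction rests on the fact that an invertible map loses no information about its input. A minimal sketch (our illustration, not the paper's code) of a single affine coupling layer, a standard building block of normalizing flows, makes this concrete: the forward transform can always be undone exactly, so the original input is recoverable bit for bit.

```python
import numpy as np

# Toy parameters standing in for the scale/shift subnetworks of a
# coupling layer (hypothetical, for illustration only).
rng = np.random.default_rng(0)
W = rng.normal(size=(2, 1))
b = rng.normal(size=2)

def s_t(x1):
    """Toy scale and translation functions of the untouched half."""
    h = np.tanh(W @ x1 + b)
    return h[0], h[1]  # (log-scale, shift)

def forward(x):
    x1, x2 = x[:1], x[1:]
    log_s, t = s_t(x1)
    z2 = x2 * np.exp(log_s) + t          # transform the second half
    return np.concatenate([x1, z2]), log_s  # log|det J| = log_s

def inverse(z):
    z1, z2 = z[:1], z[1:]
    log_s, t = s_t(z1)                   # same networks, same inputs
    x2 = (z2 - t) * np.exp(-log_s)       # exact algebraic inverse
    return np.concatenate([z1, x2])

x = np.array([0.5, -1.2])
z, logdet = forward(x)
x_rec = inverse(z)
print(np.allclose(x, x_rec))  # True: reconstruction is exact
```

Because the inverse reproduces `x` exactly, the mutual information between input and latent code is not reduced by the network itself; any "bottleneck" must come from the training objective rather than the architecture.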