Training Normalizing Flows with the Information Bottleneck for Competitive Generative Classification

Neural Information Processing Systems 

The Information Bottleneck (IB) objective uses information theory to formulate a trade-off between task performance and robustness. It has been applied successfully in the standard discriminative classification setting. We pose the question of whether the IB can also be used to train generative likelihood models such as normalizing flows. Since normalizing flows are built from invertible neural network (INN) architectures, they are information-preserving by construction, which seems to contradict the very idea of a bottleneck.
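
For readers unfamiliar with the objective, the tension described in the abstract can be stated concretely via the standard IB Lagrangian (Tishby et al.); the notation below (input X, label Y, latent representation Z, trade-off parameter β) is the usual convention for the IB and is not taken from this abstract:

\[ \mathcal{L}_{\mathrm{IB}} \;=\; I(X;Z) \;-\; \beta \, I(Z;Y) \]

Minimizing \( \mathcal{L}_{\mathrm{IB}} \) rewards a representation Z that is predictive of Y (large \( I(Z;Y) \)) while compressing away information about X (small \( I(X;Z) \)). For a deterministic invertible map \( Z = f(X) \) on continuous data, however, \( I(X;Z) \) is maximal (in fact infinite), so the compression term cannot act as a bottleneck; this is the apparent contradiction the abstract raises.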
