Controlling Neural Collapse Enhances Out-of-Distribution Detection and Transfer Learning
Md Yousuf Harun, Jhair Gallardo, Christopher Kanan
–arXiv.org Artificial Intelligence
Out-of-distribution (OOD) detection and OOD generalization are widely studied in Deep Neural Networks (DNNs), yet their relationship remains poorly understood. We empirically show that the degree of Neural Collapse (NC) in a network layer is inversely related to these objectives: stronger NC improves OOD detection but degrades generalization, while weaker NC enhances generalization at the cost of detection. This trade-off suggests that a single feature space cannot simultaneously achieve both tasks. To address this, we develop a theoretical framework linking NC to OOD detection and generalization. We show that entropy regularization mitigates NC to improve generalization, while a fixed Simplex Equiangular Tight Frame (ETF) projector enforces NC for better detection. Based on these insights, we propose a method to control NC at different DNN layers.

Figure 1: In this paper, we show that there is a close inverse relationship between OOD detection and generalization with respect to the degree of representation collapse in DNN layers. This plot illustrates this relationship for VGG17 pretrained
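The fixed Simplex ETF projector referenced in the abstract targets a maximally separated, non-trainable class geometry: K unit vectors whose pairwise cosine similarity is exactly -1/(K-1), the configuration that Neural Collapse drives class means toward. As a minimal sketch of that geometry (a NumPy illustration under the standard ETF definition; `simplex_etf` is a hypothetical helper name, not the authors' code):

```python
import numpy as np

def simplex_etf(k: int) -> np.ndarray:
    """Build a K x K Simplex Equiangular Tight Frame (ETF).

    Columns are unit vectors with pairwise cosine similarity
    exactly -1/(K-1), i.e. maximally and equally separated.
    The matrix has rank K-1, as expected for a simplex ETF.
    """
    identity = np.eye(k)
    centering = np.ones((k, k)) / k
    # Scale so each column (e_j - 1/K * 1) has unit norm.
    return np.sqrt(k / (k - 1)) * (identity - centering)

# Example: geometry for a 4-class problem.
M = simplex_etf(4)
G = M.T @ M
# Diagonal of G is 1 (unit norms); off-diagonal entries are -1/3.
```

Fixing such a matrix as a (frozen) projector, rather than learning it, is what lets a layer enforce the collapsed geometry regardless of the training dynamics of the rest of the network.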
Feb-15-2025