Asymptotic Optimism for Tensor Regression Models with Applications to Neural Network Compression
Haoming Shi, Eric C. Chi, Hengrui Luo
We study rank selection for low-rank tensor regression under a random-covariate design. Under a Gaussian random-design model and mild conditions, we derive population expressions for the expected training-testing discrepancy (optimism) for both CP and Tucker decompositions. We further show that the optimism is minimized at the true tensor rank for both CP and Tucker regression. This yields a prediction-oriented rank-selection rule that aligns with cross-validation and extends naturally to tensor-model averaging. We also discuss conditions under which under- or over-ranked models may appear preferable, thereby clarifying the scope of the method. Finally, we showcase the method's practical utility on a real-world image regression task and extend its application to tensor-based compression of neural networks, highlighting its potential for model selection in deep learning.
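The optimism studied in the abstract is the gap between expected test error and training error as a function of the fitted rank. A minimal simulation can illustrate the idea; the sketch below is not the paper's estimator — it uses a simple truncated-SVD surrogate (OLS fit, then rank-r truncation of the matricized coefficient) in place of CP/Tucker fitting, with assumed dimensions and noise level, purely to show how the empirical train-test gap varies with the candidate rank.

```python
import numpy as np

rng = np.random.default_rng(0)
p, q, r_true = 8, 8, 2          # assumed coefficient shape and true rank
n_train, n_test = 500, 500

# Low-rank coefficient matrix B (rank r_true), Gaussian random design
B = rng.standard_normal((p, r_true)) @ rng.standard_normal((r_true, q))

def make_data(n):
    X = rng.standard_normal((n, p * q))          # vectorized covariates
    y = X @ B.ravel() + 0.5 * rng.standard_normal(n)
    return X, y

X_tr, y_tr = make_data(n_train)
X_te, y_te = make_data(n_test)

# Full OLS estimate, then truncate its matricization to each candidate rank
b_ols, *_ = np.linalg.lstsq(X_tr, y_tr, rcond=None)
U, s, Vt = np.linalg.svd(b_ols.reshape(p, q), full_matrices=False)

train_err, optimism = {}, {}
for r in range(1, min(p, q) + 1):
    B_r = (U[:, :r] * s[:r]) @ Vt[:r]            # rank-r truncation
    train_err[r] = np.mean((y_tr - X_tr @ B_r.ravel()) ** 2)
    test_err = np.mean((y_te - X_te @ B_r.ravel()) ** 2)
    optimism[r] = test_err - train_err[r]        # empirical optimism
```

Because every truncated matrix is itself a valid linear coefficient, the full-rank fit (the OLS solution) always attains the smallest training error, so the training error alone cannot select the rank; the optimism-corrected comparison across `r` is what the abstract's rank-selection rule formalizes at the population level.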
Mar-30-2026