Review for NeurIPS paper: On the Optimal Weighted \ell_2 Regularization in Overparameterized Linear Regression
Neural Information Processing Systems
Weaknesses: My main concern with the paper is the novelty of the results. The authors state that previous work on linear regression is less general than the present work; in particular, prior analyses allow either only isotropic features or only an isotropic signal. The following paper, which was arXived about a month before the NeurIPS deadline, appears to handle both:

[1] Emami, Melikasadat, et al. "Generalization error of generalized linear models in high dimensions."

The results of [1] characterize the exact generalization error in the same asymptotic limit for Gaussian data with general covariance and arbitrary regularization, which includes the \ell_2-type regularizers considered here as well as more general penalties such as general \ell_p norms. Here is my understanding of the differences between the results of the two papers:
- [1] allows Gaussian features with any covariance matrix, whereas your paper allows non-Gaussian features as long as they have a bounded 12th centered moment.
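For concreteness, the weighted \ell_2 (generalized ridge) estimator under discussion takes, in its standard formulation, the following form (the notation here is my own, not necessarily the paper's):

```latex
\hat{\beta}_\Lambda
  \;=\; \arg\min_{\beta \in \mathbb{R}^p}
        \;\|y - X\beta\|_2^2 \;+\; \beta^\top \Lambda \beta
  \;=\; \left(X^\top X + \Lambda\right)^{-1} X^\top y,
```

where \Lambda is a positive definite weighting matrix (positive definiteness ensures invertibility even in the overparameterized regime n < p); taking \Lambda = \lambda I recovers ordinary ridge regression.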
Jan-25-2025, 18:05:02 GMT