Sketched Ridgeless Linear Regression: The Role of Downsampling
Xin Chen, Yicheng Zeng, Siyue Yang, Qiang Sun
Overparametrization often improves generalization. This paper presents a dual view of overparametrization, suggesting that downsampling may also help a model generalize. Focusing on the proportional regime $m\asymp n \asymp p$, where $m$ is the sketching size, $n$ the sample size, and $p$ the feature dimensionality, we investigate two out-of-sample prediction risks of the sketched ridgeless least squares estimator. Our findings challenge the conventional belief that downsampling always harms generalization: in certain cases it can actually improve it. We identify the optimal sketching size that minimizes the out-of-sample prediction risks and show that the optimally sketched estimator has more stable risk curves, eliminating the peaks exhibited by the full-sample estimator. To facilitate practical implementation, we propose an empirical procedure for determining the optimal sketching size. Finally, we extend our analysis to central limit theorems and misspecified models. Numerical studies strongly support our theory.
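To make the setup concrete, here is a minimal NumPy sketch of a sketched ridgeless least-squares estimator, not the paper's implementation: a Gaussian sketching matrix `S` of size $m\times n$ compresses the data, and the minimum-norm least-squares solution is taken on the sketched problem via the pseudoinverse. All names, sizes, and the toy data-generating model are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sketched_ridgeless(X, y, m, rng):
    """Minimum-norm least squares after Gaussian sketching.

    S has i.i.d. N(0, 1/m) entries; the estimator is (SX)^+ S y,
    i.e. the ridgeless (min-norm) solution of the sketched problem.
    """
    n = X.shape[0]
    S = rng.normal(0.0, 1.0 / np.sqrt(m), size=(m, n))
    return np.linalg.pinv(S @ X) @ (S @ y)

# Toy linear model (illustrative, not from the paper).
n, p = 200, 150
beta_true = rng.normal(size=p) / np.sqrt(p)
X = rng.normal(size=(n, p))
y = X @ beta_true + 0.5 * rng.normal(size=n)

beta_full = sketched_ridgeless(X, y, n, rng)    # m = n: no downsampling
beta_sk   = sketched_ridgeless(X, y, 100, rng)  # m < n: downsampled

# Out-of-sample prediction risk, estimated on fresh noiseless test data.
X_te = rng.normal(size=(2000, p))
y_te = X_te @ beta_true
risk = lambda b: np.mean((X_te @ b - y_te) ** 2)
print(risk(beta_full), risk(beta_sk))
```

Sweeping `m` over a grid and picking the minimizer of an estimated risk is one simple stand-in for the empirical procedure the abstract mentions.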
Oct-13-2023
- Country:
- Asia
- China
- Guangdong Province > Shenzhen (0.04)
- Hong Kong (0.04)
- Russia (0.04)
- Singapore > Central Region
- Singapore (0.04)
- Europe
- Russia (0.04)
- United Kingdom > England
- Cambridgeshire > Cambridge (0.04)
- North America
- Canada > Ontario
- Toronto (0.14)
- United States
- California > Alameda County
- Berkeley (0.04)
- New Jersey > Mercer County
- Princeton (0.04)
- New York (0.04)
- Genre:
- Research Report > New Finding (0.87)