A Residual Bootstrap for High-Dimensional Regression with Near Low-Rank Designs
Neural Information Processing Systems
We study the residual bootstrap (RB) method in the context of high-dimensional linear regression. When regression coefficients are estimated via least squares, classical results show that RB consistently approximates the laws of contrasts, provided that $p\ll n$, where the design matrix is of size $n\times p$. Up to now, relatively little work has considered how additional structure in the linear model may extend the validity of RB to the setting where $p/n\asymp 1$. In this setting, we propose a version of RB that resamples residuals obtained from ridge regression. Our main structural assumption on the design matrix is that it is nearly low rank --- in the sense that its singular values decay according to a power-law profile.
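The resampling scheme described above can be sketched in a few lines. The following is a minimal illustration, not the paper's exact procedure: the design matrix, power-law decay exponent, ridge penalty, and contrast vector are all hypothetical choices made for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a linear model with p/n of constant order and a near low-rank
# design: singular values decay as a power law (illustrative parameters).
n, p = 200, 150
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((p, p)))
sing = np.arange(1, p + 1, dtype=float) ** -1.0  # power-law singular values
X = U[:, :p] * sing @ V.T
beta = rng.standard_normal(p) / np.sqrt(p)
y = X @ beta + rng.standard_normal(n)

def ridge(X, y, lam):
    """Ridge estimator (X'X + lam I)^{-1} X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

lam = 1.0  # ridge penalty; the paper's tuning rule may differ
beta_hat = ridge(X, y, lam)
resid = y - X @ beta_hat
resid_centered = resid - resid.mean()  # center before resampling

# Residual bootstrap: resample centered ridge residuals, rebuild the
# responses, refit, and collect the bootstrap law of a contrast c' beta.
c = np.zeros(p)
c[0] = 1.0  # hypothetical contrast vector
B = 500
boot = np.empty(B)
for b in range(B):
    e_star = rng.choice(resid_centered, size=n, replace=True)
    y_star = X @ beta_hat + e_star
    boot[b] = c @ ridge(X, y_star, lam)

# `boot` now approximates the sampling distribution of c' beta_hat.
```

In practice the bootstrap sample `boot` would be used to form confidence intervals or quantile estimates for the contrast.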