Faster Ridge Regression via the Subsampled Randomized Hadamard Transform
Yichao Lu, Paramveer Dhillon, Dean P. Foster, Lyle Ungar
Neural Information Processing Systems
We propose a fast algorithm for ridge regression when the number of features is much larger than the number of observations ($p \gg n$). The standard way to solve ridge regression in this setting works in the dual space and gives a running time of $O(n^2 p)$. Our algorithm (SRHT-DRR) runs in time $O(np\log(n))$ and works by preconditioning the design matrix with a randomized Walsh-Hadamard transform, followed by a subsampling of features. We provide risk bounds for the SRHT-DRR algorithm in the fixed design setting and show experimental results on synthetic and real datasets.
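To make the pipeline in the abstract concrete, here is a minimal NumPy sketch of the idea: sign-flip the feature columns, apply a fast Walsh-Hadamard transform, uniformly subsample features, and then solve ridge regression in the dual, where only an $n \times n$ system is needed. The helper names (`fwht`, `srht`, `dual_ridge_predict`), the sketch size `p_sub`, and the toy data are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def fwht(a):
    # Fast Walsh-Hadamard transform along the last axis (length must be
    # a power of two); returns a new array, O(p log p) per row.
    a = a.astype(float).copy()
    n = a.shape[-1]
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            x = a[..., i:i + h].copy()
            a[..., i:i + h] = x + a[..., i + h:i + 2 * h]
            a[..., i + h:i + 2 * h] = x - a[..., i + h:i + 2 * h]
        h *= 2
    return a

def srht(X, p_sub, rng):
    # SRHT on the feature dimension: random signs (D), normalized
    # Hadamard transform (H), then uniform column subsampling (S),
    # rescaled so inner products are preserved in expectation.
    n, p = X.shape
    signs = rng.choice([-1.0, 1.0], size=p)
    Xh = fwht(X * signs) / np.sqrt(p)
    cols = rng.choice(p, size=p_sub, replace=False)
    return Xh[:, cols] * np.sqrt(p / p_sub)

def dual_ridge_predict(X, y, lam, X_test):
    # Dual ridge regression: alpha = (X X^T + lam I)^{-1} y, so only an
    # n x n system is solved even when the feature count is large.
    n = X.shape[0]
    alpha = np.linalg.solve(X @ X.T + lam * np.eye(n), y)
    return X_test @ (X.T @ alpha)

rng = np.random.default_rng(0)
n, p, p_sub = 100, 1024, 256          # p a power of two in this sketch
X = rng.standard_normal((n, p))
y = X @ rng.standard_normal(p) + 0.1 * rng.standard_normal(n)

Xs = srht(X, p_sub, rng)              # n x p_sub sketched design matrix
yhat = dual_ridge_predict(Xs, y, lam=1.0, X_test=Xs)
print(np.mean((yhat - y) ** 2))       # in-sample fit on the sketch
```

The evaluation is in-sample, matching the fixed design setting of the paper's risk bounds; any new test points would have to be sketched with the same signs and column indices before prediction.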