The Reliability of OKRidge Method in Solving Sparse Ridge Regression Problems

Neural Information Processing Systems 

Sparse ridge regression problems play a significant role across various domains. To solve sparse ridge regression, [1] recently proposed an advanced algorithm, Scalable Optimal K-Sparse Ridge Regression (OKRidge), which is both faster and more accurate than existing approaches. However, the absence of a theoretical analysis of the estimation error of OKRidge impedes its deployment in large-scale applications. In this paper, we reframe the estimation error of OKRidge as a Primary Optimization (PO) problem and employ the Convex Gaussian Min-max Theorem (CGMT) to reduce the PO problem to an Auxiliary Optimization (AO) problem. We then provide a theoretical error analysis for OKRidge based on the AO problem. This analysis improves the theoretical reliability of OKRidge. We also conduct experiments to verify our theorems, and the results are in excellent agreement with our theoretical findings.
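For concreteness, the k-sparse ridge regression problem that OKRidge targets can be stated in its standard form (notation here is an assumption; the abstract itself does not fix symbols):

```latex
% k-sparse ridge regression: least squares with an l2 penalty
% and a hard cardinality constraint on the coefficient vector.
% X \in \mathbb{R}^{n \times p}: design matrix, y \in \mathbb{R}^{n}: response,
% \lambda > 0: ridge parameter, k: sparsity level (assumed notation).
\min_{\beta \in \mathbb{R}^{p}} \; \|y - X\beta\|_2^2 + \lambda \|\beta\|_2^2
\quad \text{subject to} \quad \|\beta\|_0 \le k
```

The $\ell_0$ constraint makes the problem combinatorial, which is why certified-optimal solvers such as OKRidge, and theoretical error analyses of their estimates, are of interest.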