Separating Oblivious and Adaptive Models of Variable Selection
Ziyun Chen, Jerry Li, Kevin Tian, Yusong Zhu
Sparse recovery is among the most well-studied problems in learning theory and high-dimensional statistics. In this work, we investigate the statistical and computational landscapes of sparse recovery with $\ell_\infty$ error guarantees. This variant of the problem is motivated by \emph{variable selection} tasks, where the goal is to estimate the support of a $k$-sparse signal in $\mathbb{R}^d$. Our main contribution is a provable separation between the \emph{oblivious} (``for each'') and \emph{adaptive} (``for all'') models of $\ell_\infty$ sparse recovery. We show that under an oblivious model, the optimal $\ell_\infty$ error is attainable in near-linear time with $\approx k\log d$ samples, whereas in an adaptive model, $\gtrsim k^2$ samples are necessary for any algorithm to achieve this bound. This establishes a surprising contrast with the standard $\ell_2$ setting, where $\approx k \log d$ samples suffice even for adaptive sparse recovery. We conclude with a preliminary examination of a \emph{partially-adaptive} model, where we show nontrivial variable selection guarantees are possible with $\approx k\log d$ measurements.
Feb-19-2026
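As a hedged illustration of why an $\ell_\infty$ guarantee is the natural target for variable selection (this is not the paper's algorithm, just a standard observation): if an estimate $\hat{x}$ satisfies $\|\hat{x} - x\|_\infty < m/2$, where $m$ is the smallest nonzero magnitude of the $k$-sparse signal $x$, then thresholding $\hat{x}$ at $m/2$ recovers the support of $x$ exactly.

```python
import numpy as np

# Illustrative sketch (not the paper's method): an l_inf-accurate
# estimate yields exact support recovery by thresholding.
rng = np.random.default_rng(0)
d, k, m = 100, 5, 1.0  # ambient dimension, sparsity, minimum nonzero magnitude

# Build a k-sparse signal whose nonzero entries have magnitude >= m.
x = np.zeros(d)
support = rng.choice(d, size=k, replace=False)
x[support] = m * rng.choice([-1.0, 1.0], size=k) * rng.uniform(1.0, 2.0, size=k)

# Any estimate within l_inf distance < m/2 of x (here: worst-case
# entrywise perturbation of size 0.49*m).
xhat = x + rng.uniform(-0.49 * m, 0.49 * m, size=d)

# Threshold at m/2: on-support entries have |xhat_i| >= m - 0.49m > m/2,
# off-support entries have |xhat_i| <= 0.49m < m/2.
recovered = np.flatnonzero(np.abs(xhat) >= m / 2)
assert set(recovered) == set(support)
```

The separation in the abstract concerns how many samples are needed to produce such an $\hat{x}$ in each model, not the thresholding step itself.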