Nonparametric Greedy Algorithms for the Sparse Learning Problem
Neural Information Processing Systems
This paper studies the forward greedy strategy in sparse nonparametric regression. For additive models, we propose an algorithm called additive forward regression; for general multivariate models, we propose an algorithm called generalized forward regression. Both algorithms simultaneously conduct estimation and variable selection in nonparametric settings for the high-dimensional sparse learning problem. Our main emphasis is empirical: on both simulated and real data, these two simple greedy methods can clearly outperform several state-of-the-art competitors, including LASSO, a nonparametric version of LASSO called the sparse additive model (SpAM), and a recently proposed adaptive parametric forward-backward algorithm called FoBa. We also provide theoretical justification for specific versions of additive forward regression.
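To make the forward greedy strategy concrete, the sketch below shows plain parametric forward stepwise regression in NumPy: at each step it adds the variable that most reduces the residual sum of squares, then refits by least squares. This is an illustrative simplification, not the paper's additive or generalized forward regression procedures (which replace the least-squares fit with nonparametric smoothers); the function name and interface are hypothetical.

```python
import numpy as np

def forward_regression(X, y, k):
    """Greedy forward selection sketch (not the paper's exact AFR/GFR):
    repeatedly add the column of X that most reduces the residual sum
    of squares of a least-squares fit, until k variables are selected."""
    n, d = X.shape
    selected = []
    for _ in range(k):
        best_j, best_rss = None, np.inf
        for j in range(d):
            if j in selected:
                continue
            cols = selected + [j]
            # refit least squares on the candidate variable set
            beta, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
            rss = np.sum((y - X[:, cols] @ beta) ** 2)
            if rss < best_rss:
                best_rss, best_j = rss, j
        selected.append(best_j)
    beta, *_ = np.linalg.lstsq(X[:, selected], y, rcond=None)
    return selected, beta
```

The nonparametric variants studied in the paper follow the same greedy template, so estimation and variable selection happen together: each accepted variable immediately participates in the refit that scores the next candidate.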
Dec-31-2009