Split LBI: An Iterative Regularization Path with Structural Sparsity

Chendi Huang, Xinwei Sun, Jiechao Xiong, Yuan Yao

Neural Information Processing Systems 

This paper proposes an iterative regularization path with structural sparsity, based on variable splitting and the Linearized Bregman Iteration, hence called Split LBI. Despite its simplicity, Split LBI outperforms the popular generalized Lasso in both theory and experiments. A theory of path consistency is presented: equipped with a proper early stopping, Split LBI may achieve model selection consistency under a family of Irrepresentable Conditions that can be weaker than the necessary and sufficient condition for the generalized Lasso.
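To make the combination of variable splitting and Linearized Bregman Iteration concrete, below is a minimal sketch in Python/NumPy. It assumes a linear model y = X beta + noise with structural sparsity imposed on D beta through an augmented variable gamma, and uses generic Linearized Bregman updates with soft-thresholding on the split variable; the function name, default parameters (nu, kappa, alpha, n_iter), and the step-size heuristic are illustrative placeholders, not the paper's prescribed settings.

```python
import numpy as np

def split_lbi(X, y, D, nu=1.0, kappa=10.0, alpha=None, n_iter=500):
    """Sketch of a Split-LBI-style iteration: variable splitting (gamma ~ D @ beta)
    combined with Linearized Bregman updates and soft-thresholding on gamma."""
    n, p = X.shape
    m = D.shape[0]
    if alpha is None:
        # crude step-size placeholder for stability, not the paper's rule
        alpha = 1.0 / (kappa * (np.linalg.norm(X, 2) ** 2 / n
                                + (np.linalg.norm(D, 2) ** 2 + 1.0) / nu))
    beta = np.zeros(p)
    gamma = np.zeros(m)
    z = np.zeros(m)  # Bregman auxiliary variable driving gamma
    path = []
    for _ in range(n_iter):
        # gradients of the split loss
        #   L(beta, gamma) = ||y - X beta||^2 / (2n) + ||D beta - gamma||^2 / (2 nu)
        grad_beta = X.T @ (X @ beta - y) / n + D.T @ (D @ beta - gamma) / nu
        grad_gamma = (gamma - D @ beta) / nu
        # Linearized Bregman updates: gradient descent on beta,
        # soft-thresholding of the accumulated variable z for gamma
        beta = beta - kappa * alpha * grad_beta
        z = z - alpha * grad_gamma
        gamma = kappa * np.sign(z) * np.maximum(np.abs(z) - 1.0, 0.0)
        path.append((beta.copy(), gamma.copy()))
    return path  # the iterate sequence itself serves as the regularization path
```

In this reading, the iterate sequence plays the role of a regularization path indexed by the iteration count, and, as the abstract states, model selection is performed by early stopping along that path rather than by tuning a penalty parameter.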