Screening Rules and its Complexity for Active Set Identification

Eugene Ndiaye, Olivier Fercoq, Joseph Salmon

arXiv.org Machine Learning 

In learning problems involving a large number of variables, sparse models such as the Lasso and Support Vector Machines (SVM) make it possible to select the most important variables. For instance, the Lasso estimator depends only on a subset of features whose absolute correlation with the residual is maximal, whereas the SVM classifier depends only on a subset of samples (the support vectors) that characterize the margin. The remaining features/samples make no contribution to the optimal solution. Thus, early detection of these non-influential variables can lead to significant simplifications of the problem and to savings in memory and computational resources. Notable examples include the facial reduction preprocessing steps used to accelerate linear programming solvers [4, 22] and conic programming [3]; we refer to [6, 25] for recent reviews.
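The sparsity structure described above can be checked numerically. Below is a minimal sketch (not taken from the paper), assuming scikit-learn and synthetic data: at the Lasso optimum, only features whose absolute correlation with the residual reaches the regularization level alpha carry nonzero coefficients, and a linear SVM's decision function is determined by its support vectors alone. The dataset sizes, alpha, and C values are illustrative choices.

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# --- Lasso: KKT conditions imply |X_j^T r| / n = alpha for active features
# and <= alpha for inactive ones (r is the residual at the optimum).
n, p = 100, 30
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]          # only 3 informative features (assumed setup)
y = X @ beta_true + 0.1 * rng.standard_normal(n)

alpha = 0.1                                # sklearn objective: 1/(2n)||y - Xb||^2 + alpha ||b||_1
lasso = Lasso(alpha=alpha).fit(X, y)
residual = y - X @ lasso.coef_ - lasso.intercept_
corr = np.abs(X.T @ residual) / n          # feature-wise correlation with the residual

active = np.flatnonzero(lasso.coef_)
inactive = np.setdiff1d(np.arange(p), active)
print("active features:", active)
print("correlation at active features (close to alpha):", corr[active])
print("max correlation at inactive features (below alpha):", corr[inactive].max())

# --- Linear SVM: only the support vectors contribute to the classifier.
X2 = rng.standard_normal((200, 2))
y2 = np.sign(X2[:, 0] + 0.3 * rng.standard_normal(200))
svm = SVC(kernel="linear", C=1.0).fit(X2, y2)
print("support vectors:", len(svm.support_), "out of", len(y2), "samples")
```

Screening rules exploit exactly this structure: a feature certified to have correlation strictly below the regularization level (or a sample certified to lie strictly outside the margin) can be discarded before or during optimization without changing the solution.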
