Sparse Prediction with the $k$-Support Norm
Andreas Argyriou, Rina Foygel, Nathan Srebro
We derive a novel norm that corresponds to the tightest convex relaxation of sparsity combined with an $\ell_2$ penalty. We show that this new {\em $k$-support norm} provides a tighter relaxation than the elastic net and is thus a good replacement for the Lasso or the elastic net in sparse prediction problems. Through the study of the $k$-support norm, we also bound the looseness of the elastic net, thus shedding new light on it and providing justification for its use.
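As an illustrative aside (not taken from the abstract itself): the paper shows that the dual of the $k$-support norm is the $\ell_2$ norm of the $k$ largest-magnitude entries of a vector, which interpolates between the $\ell_\infty$ norm ($k=1$, dual of the Lasso's $\ell_1$) and the $\ell_2$ norm ($k=d$). A minimal sketch of that dual norm, assuming this standard definition:

```python
import numpy as np

def k_support_dual_norm(u, k):
    """Dual of the k-support norm: the l2 norm of the k
    largest-magnitude entries of u. For k=1 this is the
    l-infinity norm; for k=len(u) it is the full l2 norm."""
    u = np.abs(np.asarray(u, dtype=float))
    top_k = np.sort(u)[-k:]  # k largest magnitudes
    return float(np.sqrt(np.sum(top_k ** 2)))

# Example: for u = [3, 4, 0], k=1 gives 4.0 (l-inf),
# k=2 and k=3 both give 5.0 (l2 of the top entries).
```

The interpolation between $\ell_\infty$ and $\ell_2$ in the dual mirrors how the primal $k$-support norm interpolates between the $\ell_1$ and $\ell_2$ penalties, which is the sense in which it tightens the elastic net.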
Jun-12-2012