Sufficient Conditions for Generating Group Level Sparsity in a Robust Minimax Framework

Hongbo Zhou, Qiang Cheng

Neural Information Processing Systems 

Regularization techniques have become a principled tool in statistics and machine learning research and practice. In most situations, however, these regularization terms are not well interpreted, especially with respect to how they relate to the loss function and the data. In this paper, we propose a robust minimax framework that interprets the relationship between the data and the regularization terms for a large class of loss functions. We show that various regularization terms essentially correspond to different distortions of the original data matrix. This minimax framework includes ridge regression, lasso, elastic net, fused lasso, group lasso, local coordinate coding, multiple kernel learning, etc., as special cases. Within this minimax framework, we further give a mathematically exact definition of a novel representation called the sparse grouping representation (SGR), and prove a set of sufficient conditions for generating such group-level sparsity. Under these sufficient conditions, a large set of consistent regularization terms can be designed. The SGR differs essentially from group lasso in the way it uses class or group information, and it outperforms group lasso in the presence of group label noise. We also provide generalization bounds in a classification setting.
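The central claim, that regularization terms correspond to distortions of the data matrix, can be illustrated with a classical special case from the robust-regression literature: for a least-squares loss with column-wise perturbations of the data bounded by `rho`, the worst-case loss equals the nominal loss plus an l1 penalty, i.e. the robust problem is exactly a lasso-type regularized problem. The sketch below (my own illustration with arbitrary random data, not code from the paper) numerically verifies this identity by constructing the worst-case perturbation in closed form:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, rho = 20, 5, 0.3
X = rng.standard_normal((n, d))      # data matrix
y = rng.standard_normal(n)           # targets
w = rng.standard_normal(d)           # an arbitrary weight vector

r = y - X @ w                        # nominal residual
u = r / np.linalg.norm(r)            # unit vector along the residual

# Worst-case perturbation of each column i (with ||delta_i||_2 <= rho):
# delta_i = -rho * sign(w_i) * u, which aligns every perturbation
# with the residual direction.
Delta = -rho * np.outer(u, np.sign(w))

# max_{||delta_i|| <= rho} ||y - (X + Delta) w||_2
robust_loss = np.linalg.norm(y - (X + Delta) @ w)

# Closed form: nominal residual norm plus an l1 regularization term.
closed_form = np.linalg.norm(r) + rho * np.abs(w).sum()

print(abs(robust_loss - closed_form))  # agrees up to floating-point error
```

Minimizing the right-hand side over `w` is precisely the lasso with regularization weight `rho`, so the perturbation bound plays the role of the regularization parameter; the paper's minimax framework generalizes this correspondence to other norms and loss functions.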
