Dual feature reduction for the sparse-group lasso and its adaptive variant
Feser, Fabio, Evangelou, Marina
The sparse-group lasso performs both variable and group selection, simultaneously drawing on the strengths of the lasso and the group lasso. Its sparse-group penalty, which allows it to exploit grouping information, has made it popular in genetics, a field that regularly involves the analysis of high-dimensional data. However, the sparse-group lasso can be computationally more expensive than either the lasso or the group lasso, owing to its more complex shrinkage and the additional hyperparameter that must be tuned. In this paper, a novel method, Dual Feature Reduction (DFR), is presented that uses strong screening rules for the sparse-group lasso and the adaptive sparse-group lasso to reduce their input space before optimization. DFR applies two layers of screening and is based on the dual norms of the sparse-group lasso and the adaptive sparse-group lasso. Through synthetic and real numerical studies, the proposed feature reduction approach is shown to drastically reduce computational cost across many different scenarios.
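To illustrate the general idea of strong screening (reducing the input space before optimization), the sketch below implements the classic sequential strong rule for the plain lasso. This is only a simplified stand-in, not the paper's DFR rule for the sparse-group lasso: a feature is discarded along the regularization path when the magnitude of its gradient at the previous solution falls below a threshold depending on the current and previous penalty levels.

```python
import numpy as np

def strong_rule_lasso(X, y, lam_new, lam_prev):
    """Sequential strong screening rule for the plain lasso,
    shown only to illustrate the idea of screening; this is NOT
    the paper's DFR rule for the sparse-group lasso.

    Feature j is kept for optimization at lam_new when
    |x_j^T r| >= 2*lam_new - lam_prev, where r is the residual
    from the previous solution (the null residual y is used here
    for simplicity). Discarded features are excluded from the
    solver and checked afterwards via the KKT conditions."""
    grads = np.abs(X.T @ y)                  # |x_j^T y| per feature
    keep = grads >= 2 * lam_new - lam_prev   # strong-rule threshold
    return np.flatnonzero(keep)              # indices of surviving features

# Toy example: feature 0 is strongly correlated with y, feature 1 is not.
X = np.array([[1.0, 0.0],
              [0.0, 1.0]])
y = np.array([1.0, 0.1])
surviving = strong_rule_lasso(X, y, lam_new=0.6, lam_prev=1.0)
```

DFR extends this idea with two layers of screening (group level and variable level), derived from the dual norms of the sparse-group penalty rather than the simple lasso threshold used above.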
May 27, 2024