Sparsified-Learning for Heavy-Tailed Locally Stationary Processes
Wang, Yingjie, Alaya, Mokhtar Z., Bouzebda, Salim, Liu, Xinsheng
Sparsified learning is ubiquitous in machine learning. It regularizes the objective function by adding a penalization term that encodes the constraints placed on the learned parameters. This paper considers the problem of learning heavy-tailed locally stationary processes (LSPs). We develop a flexible and robust sparse-learning framework capable of handling heavy-tailed data with locally stationary behavior, and we establish concentration inequalities. We further provide non-asymptotic oracle inequalities for several types of sparsity, including $\ell_1$-norm and total-variation penalization for the least-squares loss.
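As a point of reference for the penalized objectives the abstract mentions, the sketch below implements plain $\ell_1$-penalized least squares (the Lasso) via proximal gradient descent (ISTA). This is a generic illustration, not the paper's estimator: it assumes i.i.d. light-tailed data, whereas the paper's contribution is handling heavy tails and local stationarity; all function and variable names here are illustrative.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of the l1-norm: shrinks each coordinate toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_ista(X, y, lam, n_iter=500):
    # ISTA: proximal gradient descent for the l1-penalized least-squares
    # objective (1/2n) * ||y - X @ beta||^2 + lam * ||beta||_1.
    n, p = X.shape
    beta = np.zeros(p)
    # Step size = 1 / Lipschitz constant of the smooth part's gradient.
    step = n / (np.linalg.norm(X, 2) ** 2)
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y) / n
        beta = soft_threshold(beta - step * grad, step * lam)
    return beta

# Toy example: a 3-sparse ground truth recovered from noisy observations.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
beta_true = np.zeros(50)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + 0.1 * rng.standard_normal(200)
beta_hat = lasso_ista(X, y, lam=0.1)
```

The total-variation penalty mentioned in the abstract replaces $\|\beta\|_1$ with $\sum_j |\beta_{j+1} - \beta_j|$, which favors piecewise-constant parameter paths rather than coordinate-wise sparsity; its proximal operator is more involved and is omitted here.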
Apr-8-2025