Sparsified-Learning for Heavy-Tailed Locally Stationary Processes

Yingjie Wang, Mokhtar Z. Alaya, Salim Bouzebda, Xinsheng Liu

arXiv.org Machine Learning 

Sparsified learning is ubiquitous in many machine learning tasks. It regularizes the objective function by adding a penalty term that encodes constraints on the learned parameters. This paper considers the problem of learning heavy-tailed locally stationary processes (LSPs). We develop a flexible and robust sparse-learning framework capable of handling heavy-tailed data with locally stationary behavior and establish concentration inequalities. We further derive non-asymptotic oracle inequalities for different types of sparsity, including $\ell_1$-norm and total-variation penalization for the least-squares loss.
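To make the penalized-least-squares setting concrete, the following is a minimal sketch (not the authors' method) of $\ell_1$-penalized least squares solved by iterative soft thresholding (ISTA); all function names, the step-size choice, and the synthetic data are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of the l1 norm: sign(z) * max(|z| - t, 0)
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista_lasso(X, y, lam, n_iter=500):
    # Minimize (1/2n) ||y - X beta||_2^2 + lam * ||beta||_1 via ISTA.
    # (Illustrative sketch; not the estimator studied in the paper.)
    n, p = X.shape
    # Step size 1/L, with L the Lipschitz constant of the smooth part's gradient
    step = n / (np.linalg.norm(X, 2) ** 2)
    beta = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y) / n          # gradient of the least-squares term
        beta = soft_threshold(beta - step * grad, step * lam)
    return beta

# Synthetic sparse-regression example (hypothetical data)
rng = np.random.default_rng(0)
n, p = 200, 50
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]                 # 3-sparse ground truth
X = rng.standard_normal((n, p))
y = X @ beta_true + 0.1 * rng.standard_normal(n)
beta_hat = ista_lasso(X, y, lam=0.05)
```

The soft-thresholding step is what produces exact zeros in the estimate; a total-variation penalty would instead apply an analogous proximal operator to successive differences of the coefficients.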