Learning sparse neural networks via sensitivity-driven regularization
Enzo Tartaglione, Skjalg Lepsøy, Attilio Fiandrotti, Gianluca Francini
Neural Information Processing Systems
The ever-increasing number of parameters in deep neural networks poses challenges for memory-limited applications. Regularize-and-prune methods aim at meeting these challenges by sparsifying the network weights. In this context, we quantify the output sensitivity to the parameters (i.e., their relevance to the network output).
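The sketch below illustrates the regularize-and-prune idea described in the abstract: parameters to which the output is least sensitive are decayed toward zero and then removed by thresholding. It is not the authors' reference implementation; the use of the loss-gradient magnitude as a per-parameter sensitivity proxy, and all function names and hyper-parameters (`lr`, `lmbda`, `threshold`), are illustrative assumptions.

```python
# Minimal sketch of sensitivity-driven sparsification, assuming a
# gradient-magnitude proxy for sensitivity (not the paper's exact formulation).
import torch
import torch.nn as nn

def sensitivity_driven_step(model, x, target, loss_fn, lr=0.1, lmbda=1e-4):
    """One SGD step with a decay term scaled by per-parameter insensitivity."""
    loss = loss_fn(model(x), target)
    model.zero_grad()
    loss.backward()

    with torch.no_grad():
        for p in model.parameters():
            if p.grad is None:
                continue
            # Gradient magnitude as a rough sensitivity estimate, normalized to [0, 1].
            sens = p.grad.abs()
            sens = sens / (sens.max() + 1e-12)
            insensitivity = (1.0 - sens).clamp(min=0.0)
            p -= lr * p.grad                     # plain gradient step
            p -= lr * lmbda * insensitivity * p  # decay only low-sensitivity weights

def prune_by_threshold(model, threshold=1e-3):
    """Set to zero the weights whose magnitude fell below the threshold."""
    with torch.no_grad():
        for p in model.parameters():
            p[p.abs() < threshold] = 0.0
```

After enough training steps, a large fraction of the low-sensitivity weights have been driven close to zero, so the final thresholding pass sparsifies the network with little effect on its output.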