The Simpler The Better: An Entropy-Based Importance Metric To Reduce Neural Networks' Depth
Victor Quétu, Zhu Liao, Enzo Tartaglione
arXiv.org Artificial Intelligence
While deep neural networks are highly effective at solving complex tasks, large pre-trained models are commonly employed even for considerably simpler downstream tasks that do not require their full complexity. Motivated by the ever-growing environmental impact of AI, we propose an efficiency strategy that leverages the prior knowledge transferred by large models. Simple yet effective, our method relies on an Entropy-bASed Importance mEtRic (EASIER) to reduce the depth of over-parametrized deep neural networks, alleviating their computational burden. We assess the effectiveness of our method on traditional image classification setups. Our code is available at https://github.com/VGCQ/EASIER.
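The abstract does not spell out how the entropy-based importance metric is computed. As an illustrative sketch only, one plausible reading is to score each layer by the entropy of its neurons' ReLU on/off states over a batch, and flag the lowest-entropy layer (whose neurons behave near-deterministically) as a removal candidate. The function, layer names, and toy data below are all hypothetical and not taken from the paper:

```python
import numpy as np

def layer_state_entropy(pre_activations):
    """Mean binary entropy of ReLU on/off states across a layer's neurons.

    pre_activations: (batch, neurons) pre-ReLU outputs of one layer.
    A neuron that fires on nearly all or nearly no inputs contributes
    ~0 bits; a layer whose neurons are near-deterministic is therefore
    a natural candidate for removal under this (assumed) criterion.
    """
    p_on = (pre_activations > 0).mean(axis=0)   # firing rate per neuron
    p_on = np.clip(p_on, 1e-12, 1 - 1e-12)      # avoid log(0)
    h = -(p_on * np.log2(p_on) + (1 - p_on) * np.log2(1 - p_on))
    return h.mean()

rng = np.random.default_rng(0)
x = rng.normal(size=(256, 16))

# Two hypothetical hidden layers: one with roughly balanced
# pre-activations (high state entropy) and one with a strong positive
# bias so almost every neuron is always "on" (low state entropy).
balanced = x @ rng.normal(size=(16, 32))
saturated = x @ rng.normal(size=(16, 32)) + 5.0

scores = {"layer_1": layer_state_entropy(balanced),
          "layer_2": layer_state_entropy(saturated)}
candidate = min(scores, key=scores.get)  # lowest-entropy layer first
print(scores, candidate)
```

In a depth-reduction loop, one would remove (or linearize) the candidate layer, fine-tune, and repeat while accuracy stays acceptable; the actual EASIER procedure may differ.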
June 5, 2024