On Pruning State-Space LLMs
Tamer Ghattas, Michael Hassid, Roy Schwartz
arXiv.org Artificial Intelligence
Recent work proposed state-space models (SSMs) as an efficient alternative to transformer-based LLMs. Can these models be pruned to further reduce their computation costs? We adapt several pruning methods to the SSM structure and apply them to four SSM-based LLMs across multiple tasks. We find that such models are quite robust to some pruning methods (e.g., WANDA), while other methods lead to rapid performance degradation.
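For context, WANDA (Sun et al., 2024) scores each weight by its magnitude multiplied by the L2 norm of the corresponding input activation, then removes the lowest-scoring weights without retraining. A minimal sketch of that criterion for a single linear layer is below; the function name and the assumption that activation norms come from a small calibration set are illustrative, not the paper's actual code:

```python
import torch

def wanda_prune_layer(weight: torch.Tensor,
                      act_norms: torch.Tensor,
                      sparsity: float = 0.5) -> torch.Tensor:
    """WANDA-style pruning sketch for one linear layer.

    weight:    (out_features, in_features) weight matrix.
    act_norms: (in_features,) L2 norm of each input feature,
               collected over a calibration set (assumed given).
    sparsity:  fraction of weights to zero out in each output row.
    """
    # WANDA score: |W_ij| * ||X_j||_2 -- weight magnitude scaled by
    # the norm of the input activation it multiplies.
    scores = weight.abs() * act_norms.unsqueeze(0)

    # Compare scores per output row, as in the original WANDA setup.
    n_prune = int(weight.shape[1] * sparsity)
    # Indices of the n_prune lowest-scoring weights in each row.
    prune_idx = torch.topk(scores, n_prune, dim=1, largest=False).indices

    # Zero out the selected weights via a boolean mask.
    mask = torch.ones_like(weight, dtype=torch.bool)
    mask.scatter_(1, prune_idx, False)
    return weight * mask
```

The per-row comparison group mirrors the original WANDA formulation; applying such a criterion to SSM layers requires deciding which of the state-space parameter matrices it targets, which is the kind of adaptation the paper studies.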
Feb-26-2025