Systematic Weight Evaluation for Pruning Large Language Models: Enhancing Performance and Sustainability

Islam, Ashhadul, Belhaouari, Samir Brahim, Bermak, Amine

arXiv.org Artificial Intelligence 

Key Findings:

Impact of Compression on Model Performance: Through comprehensive experiments, the study demonstrates that moderate pruning can enhance model efficiency, while excessive compression leads to substantial performance degradation in both language and multimodal models.

Sustainable AI Development: The findings emphasize the need for optimized AI models to reduce environmental impact, addressing critical issues such as the carbon footprint, electricity consumption, and water consumption associated with training and deploying large-scale AI systems.

Affiliation: College of Science & Engineering, Hamad Bin Khalifa University, Education City, Doha, 34110, Qatar

Abstract

The exponential growth of large language models (LLMs) such as ChatGPT has revolutionized artificial intelligence, offering unprecedented capabilities in natural language processing. However, the extensive computational resources required to train these models have significant environmental implications, including high carbon emissions, energy consumption, and water usage. This research presents a novel approach to LLM pruning, focusing on the systematic evaluation of individual weight importance throughout the training process. By monitoring parameter evolution over time, we propose a method that effectively reduces model size without compromising performance. Extensive experiments with both a scaled-down LLM and a large multimodal model reveal that moderate pruning enhances efficiency and reduces loss, while excessive pruning drastically deteriorates model performance. These findings highlight the critical need for optimized AI models to ensure sustainable development, balancing technological advancement with environmental responsibility.
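The abstract describes scoring individual weights by tracking their evolution across training and then pruning the least important ones. A minimal sketch of that idea is below; it uses mean absolute magnitude over training checkpoints as a stand-in importance score, since the paper's exact criterion is not given here. The function names, the sparsity parameter, and the NumPy toy data are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def track_importance(weight_snapshots):
    # weight_snapshots: (steps, n_weights) array holding a layer's weights
    # recorded at checkpoints during training.
    # Assumed importance score: mean absolute magnitude across training,
    # standing in for the paper's systematic weight evaluation.
    return np.abs(weight_snapshots).mean(axis=0)

def prune_by_importance(weights, importance, sparsity):
    # Zero out the `sparsity` fraction of weights with the lowest
    # tracked importance; return pruned weights and the keep-mask.
    k = int(sparsity * weights.size)
    mask = np.ones_like(weights, dtype=bool)
    if k > 0:
        drop_idx = np.argsort(importance)[:k]
        mask[drop_idx] = False
    return weights * mask, mask

# Toy demo: 5 checkpoints of a 10-weight layer.
snapshots = rng.normal(size=(5, 10))
final_weights = snapshots[-1]
imp = track_importance(snapshots)
pruned, mask = prune_by_importance(final_weights, imp, sparsity=0.3)
print(f"kept {mask.sum()} of {mask.size} weights")
```

In this framing, the pruning ratio (`sparsity`) is the lever the experiments vary: moderate values remove only persistently low-magnitude weights, while aggressive values begin discarding weights that mattered during training, matching the degradation the study reports.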