How to Parameterize Asymmetric Quantization Ranges for Quantization-Aware Training
Jaeseong You, Minseop Park, Kyunggeun Lee, Seokjun An, Chirag Patel, Markus Nagel
– arXiv.org Artificial Intelligence
This paper investigates three different parameterizations of asymmetric uniform quantization for quantization-aware training: (1) scale and offset, (2) minimum and maximum, and (3) beta and gamma. We perform a comprehensive comparative analysis of how these parameterizations influence quantization-aware training, using both controlled experiments and real-world large language models. Our particular focus is on how their behavior changes with two critical training hyperparameters: bit width and learning rate. Based on our investigation, we propose best practices to stabilize and accelerate quantization-aware training with learnable asymmetric quantization ranges. In low-resource settings, such as on-device applications or in developing countries, model efficiency is critical.
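To make the three parameterizations concrete, the following is a minimal PyTorch sketch, not the paper's reference implementation. The `fake_quantize` helper, the class names, and the straight-through estimator are illustrative choices; in particular, the beta/gamma variant shown here (learnable multiplicative factors on the initial min/max) is our assumed formulation, and the paper's exact definition may differ.

```python
# Sketch of three learnable-range parameterizations for asymmetric uniform
# quantization in QAT. Gradients reach the range parameters through the
# dequantized output via a straight-through estimator (STE) on round().
import torch
import torch.nn as nn


def fake_quantize(x, scale, offset, qmin, qmax):
    """Quantize-dequantize: round uses an STE so the backward pass is identity."""
    q = x / scale + offset
    q = q + (torch.round(q) - q).detach()  # forward: round; backward: identity
    q = torch.clamp(q, qmin, qmax)
    return (q - offset) * scale


class ScaleOffsetQuant(nn.Module):
    """(1) Directly learn the scale and the offset (zero-point)."""
    def __init__(self, x_min, x_max, bits=8):
        super().__init__()
        self.qmin, self.qmax = 0, 2 ** bits - 1
        scale = (x_max - x_min) / (self.qmax - self.qmin)
        self.scale = nn.Parameter(torch.tensor(scale))
        self.offset = nn.Parameter(torch.tensor(-x_min / scale))

    def forward(self, x):
        return fake_quantize(x, self.scale, self.offset, self.qmin, self.qmax)


class MinMaxQuant(nn.Module):
    """(2) Learn the clipping range (min, max); derive scale/offset from it."""
    def __init__(self, x_min, x_max, bits=8):
        super().__init__()
        self.qmin, self.qmax = 0, 2 ** bits - 1
        self.x_min = nn.Parameter(torch.tensor(x_min))
        self.x_max = nn.Parameter(torch.tensor(x_max))

    def forward(self, x):
        scale = (self.x_max - self.x_min) / (self.qmax - self.qmin)
        offset = -self.x_min / scale
        return fake_quantize(x, scale, offset, self.qmin, self.qmax)


class BetaGammaQuant(nn.Module):
    """(3) Learn multiplicative factors beta/gamma on a frozen initial range
    (an assumed formulation; see the paper for its exact definition)."""
    def __init__(self, x_min, x_max, bits=8):
        super().__init__()
        self.qmin, self.qmax = 0, 2 ** bits - 1
        self.register_buffer("init_min", torch.tensor(x_min))
        self.register_buffer("init_max", torch.tensor(x_max))
        self.beta = nn.Parameter(torch.tensor(1.0))
        self.gamma = nn.Parameter(torch.tensor(1.0))

    def forward(self, x):
        x_min = self.beta * self.init_min
        x_max = self.gamma * self.init_max
        scale = (x_max - x_min) / (self.qmax - self.qmin)
        offset = -x_min / scale
        return fake_quantize(x, scale, offset, self.qmin, self.qmax)
```

All three modules produce the same forward mapping at initialization (e.g. `MinMaxQuant(-1.0, 3.0, bits=4)(x)`); the paper's point is that they differ in gradient dynamics, and hence in stability across bit widths and learning rates.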
Apr-25-2024