ClimbQ: Class Imbalanced Quantization Enabling Robustness on Efficient Inferences
Quantization compresses models to low bits for efficient inference and has received increasing attention. However, existing approaches focus on balanced datasets, whereas imbalanced data are pervasive in the real world. Therefore, in this study we investigate a realistic problem: quantization on class-imbalanced data. Our analysis shows that quantizing imbalanced data tends to incur a large error due to the differences among the separate class distributions, which leads to a significant accuracy loss. To address this issue, we propose a novel quantization framework, Class Imbalanced Quantization (ClimbQ), which reduces quantization error by diminishing inter-class heterogeneity. ClimbQ first scales the variance of each class distribution and then projects the data through the new distributions into a common space for quantization. To guarantee the homogeneity of class variances after the ClimbQ process, we examine the quantized features and derive that homogeneity holds when the data size of each class is restricted (bounded). Accordingly, we design a Homogeneous Variance Loss (HomoVar Loss), which reweights the per-class data losses based on the bounded data sizes so that the homogeneity of class variances is satisfied. Extensive experiments on class-imbalanced and balanced benchmark datasets reveal that ClimbQ outperforms state-of-the-art quantization techniques, especially on highly imbalanced data.
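The two-step idea described above (equalize each class's variance, then quantize all classes in one shared space) can be illustrated with a minimal sketch. This is a hypothetical NumPy illustration of the concept, not the paper's implementation; the function names, the choice of a single shared `target_std`, and the plain uniform quantizer are all assumptions made for clarity.

```python
import numpy as np

def scale_to_common_variance(features, labels, target_std=1.0):
    """Rescale each class's features so all classes share one standard deviation.

    Illustrative sketch of per-class distribution scaling: once every class
    has the same spread, a single quantizer covers all classes with a
    smaller worst-case error than on the original heterogeneous data.
    """
    scaled = np.empty_like(features, dtype=float)
    for c in np.unique(labels):
        mask = labels == c
        cls = features[mask]
        std = cls.std() + 1e-8  # guard against zero-variance classes
        # Center, rescale to the shared spread, then restore the class mean.
        scaled[mask] = (cls - cls.mean()) / std * target_std + cls.mean()
    return scaled

def uniform_quantize(x, bits=4):
    """Plain uniform quantizer over the (now shared) data range."""
    levels = 2 ** bits - 1
    lo, hi = x.min(), x.max()
    step = (hi - lo) / levels
    return np.round((x - lo) / step) * step + lo
```

For example, mixing a low-variance class with a high-variance class and calling `scale_to_common_variance` before `uniform_quantize` keeps the quantization grid equally well matched to both classes.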
Supplementary Materials of ClimbQ: Class Imbalanced Quantization Enabling Robustness on Efficient Inferences
Ting-An Chen 1,2, De-Nian Yang 2,3, Ming-Syan Chen 1,3
We first propose a distribution scaling on the class variances. After the determination of the scaled class distributions, we estimate the appropriate data size for each class (Sec. A.2). To ensure the homogeneity of the variances, we examine it in accordance with Levene's test, and we derive from the analytical results that the homogeneity criterion is satisfied if the data size of each class is restricted. Given the definitions and the notations in Eq. (3) and its subsequent paragraphs, the HomoVar loss is then proposed.
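The homogeneity check mentioned above can be sketched as a small helper around Levene's test for equality of variances. This is a minimal sketch assuming SciPy's `scipy.stats.levene`; the function name and the significance threshold `alpha` are illustrative choices, not part of the paper.

```python
import numpy as np
from scipy.stats import levene

def variances_are_homogeneous(groups, alpha=0.05):
    """Check homogeneity of variances across class groups with Levene's test.

    Failing to reject the null hypothesis (p-value above `alpha`) is taken
    as evidence that the class variances are homogeneous.
    """
    stat, pvalue = levene(*groups)
    return pvalue > alpha
```

In this setting, the helper would be applied to the per-class quantized features after scaling: if the test rejects equal variances, the class data sizes have to be restricted further before the homogeneity criterion holds.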