Enhancing Burmese News Classification with Kolmogorov-Arnold Network Head Fine-tuning
Thura Aung, Eaint Kay Khaing Kyaw, Ye Kyaw Thu, Thazin Myint Oo, Thepchai Supnithi
arXiv.org Artificial Intelligence
In low-resource languages like Burmese, classifiers are often built by fine-tuning only the final classification layer while keeping the pre-trained encoder weights frozen. Multi-Layer Perceptrons (MLPs) are the common choice for this head, but their fixed non-linearity can limit expressiveness and increase computational cost. This work explores Kolmogorov-Arnold Networks (KANs) as alternative classification heads, evaluating the Fourier-based FourierKAN, spline-based EfficientKAN, and grid-based FasterKAN across diverse embeddings, including TF-IDF, fastText, and multilingual transformers (mBERT, Distil-mBERT). Experimental results show that KAN-based heads are competitive with or superior to MLPs. EfficientKAN with fastText embeddings achieved the highest F1-score (0.928), while FasterKAN offered the best trade-off between speed and accuracy. On transformer embeddings, EfficientKAN matched or slightly outperformed MLPs, reaching 0.917 F1 with mBERT. These findings highlight KANs as expressive, efficient alternatives to MLPs for low-resource language classification.
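As a concrete illustration of the head-only setup described in the abstract, the sketch below implements a minimal Fourier-based KAN classification head in PyTorch and applies it to frozen embeddings. This is a sketch under stated assumptions, not the authors' implementation: the grid size, the initialization scale, the embedding dimension (768, matching mBERT's hidden size), and the number of news categories are all illustrative choices.

```python
import torch
import torch.nn as nn


class FourierKANLayer(nn.Module):
    """Minimal Fourier-based KAN layer (sketch).

    Instead of an MLP's fixed activation, each input-output edge learns a
    univariate function expressed as a truncated Fourier series; the layer
    output sums these learned functions over all inputs.
    """

    def __init__(self, in_dim: int, out_dim: int, grid_size: int = 8):
        super().__init__()
        self.grid_size = grid_size
        # Fourier coefficients, shape (2, out_dim, in_dim, grid_size):
        # index 0 holds cosine terms, index 1 holds sine terms.
        self.coeffs = nn.Parameter(
            torch.randn(2, out_dim, in_dim, grid_size)
            / (in_dim * grid_size) ** 0.5  # illustrative init scale
        )
        self.bias = nn.Parameter(torch.zeros(out_dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_dim); frequencies k = 1..grid_size
        k = torch.arange(1, self.grid_size + 1, device=x.device, dtype=x.dtype)
        xk = x.unsqueeze(-1) * k                  # (batch, in_dim, grid_size)
        cos, sin = torch.cos(xk), torch.sin(xk)
        # Sum the learned Fourier terms over input dims and frequencies.
        y = torch.einsum("big,oig->bo", cos, self.coeffs[0])
        y = y + torch.einsum("big,oig->bo", sin, self.coeffs[1])
        return y + self.bias


# Head-only fine-tuning: the encoder output is treated as a fixed feature.
# num_classes = 5 is a hypothetical number of Burmese news categories.
num_classes = 5
head = FourierKANLayer(in_dim=768, out_dim=num_classes)

frozen_embeddings = torch.randn(32, 768)  # stand-in for frozen encoder output
logits = head(frozen_embeddings)          # (32, num_classes)
print(logits.shape)
```

In this regime only head.parameters() would be passed to the optimizer, mirroring the frozen-encoder setup the abstract describes; swapping in a spline- or grid-based parameterization of the edge functions would yield EfficientKAN- or FasterKAN-style heads.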
Nov-27-2025