Kron-LoRA: Hybrid Kronecker-LoRA Adapters for Scalable, Sustainable Fine-tuning
– arXiv.org Artificial Intelligence
Fine-tuning massive pre-trained language models across many tasks demands adapters that are both parameter-efficient and expressive. We introduce Kron-LoRA, a hybrid adapter that combines Kronecker-structured factorization with low-rank LoRA compression, an integration that, to our knowledge, has not been explored in parameter-efficient fine-tuning or in the matrix approximation literature. Kron-LoRA achieves up to 4× fewer parameters than standard LoRA while retaining similar expressivity. Experiments on DistilBERT, Mistral-7B, LLaMA-2-7B, and LLaMA-3-8B across eight benchmarks show that Kron-LoRA matches or exceeds LoRA baselines with modest memory savings and only a 5-8% speed overhead. In sequential fine-tuning, it also delivers competitive cross-task transfer despite using only one-quarter of the adapter parameters. Kron-LoRA thus offers a scalable, sustainable solution for multi-task adaptation of large language models.
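The abstract does not spell out the factorization, but one plausible reading is that the adapter update is a Kronecker product in which one factor is itself low-rank. The PyTorch sketch below illustrates that reading only; the class name, factor shapes, and initialization are assumptions for illustration, not the paper's implementation.

import torch
import torch.nn as nn

class KronLoRALinear(nn.Module):
    # Hypothetical sketch: wraps a frozen nn.Linear and adds a
    # Kronecker-structured, low-rank update dW = kron(A, B1 @ B2).
    # A, B1, B2 are the only trainable parameters; the factor shapes
    # must tile the base weight: out_features = m1*m2, in_features = n1*n2.
    def __init__(self, base: nn.Linear, m1, n1, m2, n2, r=4, alpha=1.0):
        super().__init__()
        assert base.out_features == m1 * m2
        assert base.in_features == n1 * n2
        self.base = base
        for p in self.base.parameters():
            p.requires_grad_(False)  # freeze the pre-trained weight
        self.A = nn.Parameter(torch.randn(m1, n1) * 0.02)   # small Kronecker factor
        self.B1 = nn.Parameter(torch.randn(m2, r) * 0.02)   # low-rank half of the large factor
        self.B2 = nn.Parameter(torch.zeros(r, n2))          # zero-init so dW = 0 at the start
        self.alpha = alpha

    def forward(self, x):
        B = self.B1 @ self.B2                  # (m2, n2), rank <= r
        dW = torch.kron(self.A, B)             # (m1*m2, n1*n2)
        return self.base(x) + self.alpha * (x @ dW.T)

# Example: adapt a 768x768 layer (16*48 = 768 on each side).
layer = nn.Linear(768, 768)
adapter = KronLoRALinear(layer, m1=16, n1=16, m2=48, n2=48, r=4)
y = adapter(torch.randn(2, 768))

Under this parameterization the adapter trains m1·n1 + r·(m2 + n2) weights, versus r·(m1·m2 + n1·n2) for a standard LoRA of the same rank, which is the kind of gap that could account for the parameter savings the abstract reports.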
Sep-25-2025