Localized LoRA: A Structured Low-Rank Approximation for Efficient Fine-Tuning
Barazandeh, Babak, Majumdar, Subhabrata, Rajyaguru, Om, Michailidis, George
arXiv.org Artificial Intelligence
However, most existing approaches rely on global low-rank structures, which can overlook spatial patterns spread across the parameter space. In this work, we propose Localized LoRA, a generalized framework that models weight updates as a composition of low-rank matrices applied to structured blocks of the weight matrix. This formulation enables dense, localized updates throughout the parameter space without increasing the total number of trainable parameters. We provide a formal comparison between global, diagonal-local, and fully localized low-rank approximations, and show that our method consistently achieves lower approximation error under matched parameter budgets. Experiments on both synthetic and practical settings demonstrate that Localized LoRA offers a more expressive and adaptable alternative to existing methods, enabling efficient fine-tuning with improved performance.
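The block-wise formulation described above can be illustrated with a minimal NumPy sketch. This is an assumed reconstruction, not the authors' implementation: the function name, dimensions, and the budget-matching rule (splitting a global rank r across a p×p grid of blocks, each of local rank r/p, so the trainable parameter count matches global LoRA) are illustrative choices.

```python
import numpy as np

def localized_lora_update(d=16, p=2, r=8, seed=0):
    """Hypothetical sketch of a localized low-rank update.

    Partitions a d x d weight update into a p x p grid of blocks and
    gives each block its own low-rank factorization A_ij @ B_ij of
    rank r_loc = r // p. Total trainable parameters then equal those
    of a single global rank-r LoRA update:
        global:    2 * d * r
        localized: p^2 * 2 * (d/p) * (r/p) = 2 * d * r
    """
    rng = np.random.default_rng(seed)
    r_loc = r // p          # local rank under a matched parameter budget
    b = d // p              # block side length
    delta = np.zeros((d, d))
    for i in range(p):
        for j in range(p):
            A = rng.standard_normal((b, r_loc))   # block's down-projection
            B = rng.standard_normal((r_loc, b))   # block's up-projection
            delta[i*b:(i+1)*b, j*b:(j+1)*b] = A @ B
    return delta
```

Unlike a single global low-rank update, whose overall rank is capped at r, the assembled block-diagonal-plus-off-diagonal update can have rank up to p * r_loc per block row, which is one way to see why localized updates can achieve lower approximation error at the same parameter cost.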
Sep-25-2025