HOFT: Householder Orthogonal Fine-tuning
Alejandro Moreno Arcas, Albert Sanchis, Jorge Civera, Alfons Juan
arXiv.org Artificial Intelligence
Adaptation of foundation models using low-rank methods is a widespread approach. Another way to adapt these models is through orthogonal fine-tuning methods, which have good generalization properties but are less time and memory efficient. In this work, we propose Householder Orthogonal Fine-tuning (HOFT), a novel orthogonal fine-tuning method that aims to reduce the time and space complexity. Moreover, some theoretical properties of the orthogonal fine-tuning paradigm are explored, and from this exploration Scaled Householder Orthogonal Fine-tuning (SHOFT) is proposed. Both HOFT and SHOFT are evaluated on downstream tasks, namely commonsense reasoning, machine translation, subject-driven generation and mathematical reasoning. Compared with state-of-the-art adaptation methods, HOFT and SHOFT show comparable or better results.
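The paper's exact parameterization is not given in this abstract; as a rough sketch of the underlying idea, orthogonal fine-tuning multiplies a frozen pretrained weight by a learned orthogonal matrix, and a Householder reflection H = I − 2vvᵀ/(vᵀv) is a cheap way to build one: a product of r reflections yields an orthogonal transform from only r·d trainable scalars. The function names below are illustrative, not the paper's API.

```python
import numpy as np

def householder(v):
    """Return the Householder reflection I - 2 v v^T / (v^T v), which is orthogonal."""
    v = v / np.linalg.norm(v)
    return np.eye(v.size) - 2.0 * np.outer(v, v)

def orthogonal_update(W, vectors):
    """Adapt W by a product of Householder reflections: (H_r ... H_1) @ W.

    Only the vectors are trainable; W stays frozen, and the update is
    exactly orthogonal by construction (no extra orthogonality penalty).
    """
    Q = np.eye(W.shape[0])
    for v in vectors:
        Q = householder(v) @ Q
    return Q @ W

rng = np.random.default_rng(0)
d = 8
W = rng.standard_normal((d, d))              # frozen pretrained weight
vs = [rng.standard_normal(d) for _ in range(2)]  # budget of r = 2 reflections
W_adapted = orthogonal_update(W, vs)

# An orthogonal update preserves the column norms (and angles) of W,
# which is the geometric property motivating orthogonal fine-tuning.
print(np.allclose(np.linalg.norm(W, axis=0),
                  np.linalg.norm(W_adapted, axis=0)))
```

Because each reflection costs O(d) parameters and can be applied without materializing a dense d×d matrix, this style of parameterization is one plausible route to the time and space savings the abstract claims over earlier orthogonal fine-tuning methods.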
Sep-11-2025