FISH-Tuning: Enhancing PEFT Methods with Fisher Information
Xue, Kang; Dong, Ming; Tu, Xinhui; He, Tingting
The rapid growth in the parameter size of Large Language Models (LLMs) has spurred the development of Parameter-Efficient Fine-Tuning (PEFT) methods to mitigate the substantial computational costs of fine-tuning. Among these, Fisher Induced Sparse uncHanging (FISH) Mask is a selection-based PEFT technique that identifies a critical subset of pre-trained parameters using approximate Fisher information. While addition-based and reparameterization-based PEFT methods like LoRA and Adapter already fine-tune only a small number of parameters, the newly introduced parameters within these methods themselves present an opportunity for further optimization. Selectively fine-tuning only the most impactful among these new parameters could further reduce resource consumption while maintaining, or even improving, fine-tuning effectiveness. In this paper, we propose \textbf{FISH-Tuning}, a novel approach that incorporates FISH Mask into such PEFT methods, including LoRA, Adapter, and their variants. By leveraging Fisher information to identify and update only the most significant parameters within these added or reparameterized components, FISH-Tuning aims to achieve superior performance without increasing training time or inference latency compared to the vanilla PEFT methods. Experimental results across various datasets and pre-trained models demonstrate that FISH-Tuning consistently outperforms the vanilla PEFT methods when using the same proportion of trainable parameters. Code is available at https://anonymous.4open.science/r/FISH-Tuning-6F7C.
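The FISH Mask technique underlying this paper scores each parameter by an approximate (empirical, diagonal) Fisher information, i.e. squared gradients averaged over a handful of labeled samples, then keeps only the top fraction for training. A minimal PyTorch sketch of that selection step follows; the function name `fish_mask` and the defaults (`keep_ratio`, `num_samples`) are illustrative, not from the paper's released code.

```python
import torch

def fish_mask(model, dataloader, keep_ratio=0.005, num_samples=64):
    """Approximate diagonal Fisher information by accumulating squared
    gradients over a few labeled samples, then keep the globally
    top-scoring fraction of parameters as the trainable mask."""
    fisher = {n: torch.zeros_like(p)
              for n, p in model.named_parameters() if p.requires_grad}
    seen = 0
    for inputs, labels in dataloader:
        if seen >= num_samples:
            break
        model.zero_grad()
        loss = torch.nn.functional.cross_entropy(model(inputs), labels)
        loss.backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2
        seen += inputs.size(0)
    # Global threshold: the smallest score among the top keep_ratio fraction.
    all_scores = torch.cat([f.flatten() for f in fisher.values()])
    k = max(1, int(keep_ratio * all_scores.numel()))
    threshold = torch.topk(all_scores, k).values.min()
    return {n: (f >= threshold).float() for n, f in fisher.items()}
```

During fine-tuning, the returned masks are multiplied into each parameter's gradient so that only the selected entries are updated. FISH-Tuning applies the same scoring to the parameters that LoRA or an Adapter introduces, rather than to the pre-trained weights.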
Data-oriented Dynamic Fine-tuning Parameter Selection Strategy for FISH Mask based Efficient Fine-tuning
Dong, Ming; Xue, Kang; Zheng, Bolong; He, Tingting
Given the enormous parameter counts of Large Language Models (LLMs), tuning all parameters is very costly, so fine-tuning specific parameters is more sensible. Most parameter-efficient fine-tuning (PEFT) work concentrates on parameter selection strategies, such as additive, selective, and reparameterization-based methods. However, few methods consider the impact of data samples on parameter selection; the FISH Mask based method is one example. FISH Mask randomly chooses a subset of data samples and treats them equally during parameter selection, so it cannot dynamically select optimal parameters for varying data distributions. In this work, we adopt a data-oriented perspective and propose an IRD ($\mathrm{\underline I}$terative sample-parameter $\mathrm{\underline R}$ange $\mathrm{\underline D}$ecreasing) algorithm to search for the best sample-parameter pairing for FISH Mask. In each iteration, by searching for the sets of samples and parameters with larger Fisher information, IRD finds better sample-parameter pairings at most scales. We demonstrate the effectiveness and rationality of the proposed strategy through experiments on the GLUE benchmark. Experimental results show that our strategy optimizes parameter selection and achieves preferable performance.
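The abstract describes IRD only at a high level: starting from the full sample and parameter ranges, each iteration retains the subsets with larger Fisher information and shrinks the search range. The sketch below is a hypothetical reading of that idea, not the paper's algorithm: `score_fn`, `keep_frac`, and `min_size` are all assumed names, and the per-sample/per-parameter Fisher scores are abstracted into a callable.

```python
import numpy as np

def ird_search(score_fn, n_samples, n_params, keep_frac=0.5, min_size=8):
    """Hypothetical iterative range-decreasing search: repeatedly keep
    the samples and parameters with the largest aggregate Fisher scores,
    halving each candidate set until both reach min_size.

    score_fn(samples, params) -> Fisher-score matrix of shape
    (len(samples), len(params)).
    """
    samples = np.arange(n_samples)
    params = np.arange(n_params)
    while len(samples) > min_size or len(params) > min_size:
        scores = score_fn(samples, params)
        if len(samples) > min_size:
            keep = max(min_size, int(len(samples) * keep_frac))
            # Rank samples by their total Fisher score over current params.
            samples = samples[np.argsort(-scores.sum(axis=1))[:keep]]
            scores = score_fn(samples, params)
        if len(params) > min_size:
            keep = max(min_size, int(len(params) * keep_frac))
            # Rank parameters by their total score over current samples.
            params = params[np.argsort(-scores.sum(axis=0))[:keep]]
    return samples, params
```

The point of the iteration, as the abstract frames it, is that which parameters look important depends on which samples are used to estimate Fisher information (and vice versa), so the two sets are narrowed jointly rather than fixing a random sample set up front as vanilla FISH Mask does.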