Adapters: A Unified Library for Parameter-Efficient and Modular Transfer Learning
Clifton Poth, Hannah Sterz, Indraneil Paul, Sukannya Purkayastha, Leon Engländer, Timo Imhof, Ivan Vulić, Sebastian Ruder, Iryna Gurevych, Jonas Pfeiffer
arXiv.org Artificial Intelligence
We introduce Adapters, an open-source library that unifies parameter-efficient and modular transfer learning in large language models. By integrating 10 diverse adapter methods into a unified interface, Adapters offers ease of use and flexible configuration. Our library allows researchers and practitioners to leverage adapter modularity through composition blocks, enabling the design of complex adapter setups. We demonstrate the library's efficacy by evaluating its performance against full fine-tuning on various NLP tasks. Adapters provides a powerful tool for addressing the challenges of conventional fine-tuning paradigms and promoting more efficient and modular transfer learning. The library is available via https://adapterhub.ml/adapters.
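To make the underlying idea concrete, here is a minimal NumPy sketch of the bottleneck adapter, the classic parameter-efficient method that libraries like this one implement among others. This is an illustrative sketch of the technique, not the library's actual API; all names and initialization choices below are assumptions:

```python
import numpy as np

def bottleneck_adapter(h, W_down, W_up):
    """Bottleneck adapter forward pass (illustrative sketch).

    The frozen model's hidden states h are projected down to a small
    bottleneck, passed through a nonlinearity, projected back up, and
    added residually. Only W_down and W_up are trained, so the number
    of tunable parameters is a small fraction of the full model's.
    """
    z = np.maximum(0.0, h @ W_down)  # down-projection + ReLU
    return h + z @ W_up              # up-projection + residual connection

rng = np.random.default_rng(0)
d, r = 8, 2                          # hidden size, bottleneck size (r << d)
h = rng.standard_normal((4, d))      # a batch of hidden states
W_down = rng.standard_normal((d, r)) * 0.01
W_up = np.zeros((r, d))              # zero init: adapter starts as identity
out = bottleneck_adapter(h, W_down, W_up)
```

With the up-projection initialized to zero, the adapter initially computes the identity, so inserting it leaves the pretrained model's behavior unchanged until training begins; composing several such trained modules sequentially or in parallel is the kind of setup the library's composition blocks express.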
Nov-18-2023