Online Adaptation of Language Models with a Memory of Amortized Contexts
Jihoon Tack, Eric Mitchell
Neural Information Processing Systems
Due to the rapid generation and dissemination of information, large language models (LLMs) quickly become outdated despite enormous development costs. To address the crucial need to keep models current, online learning has emerged as a critical tool for deploying LLMs in real-world applications. However, given the ever-expanding corpus of unseen documents and the large parameter space of modern LLMs, efficient adaptation is essential. To address these challenges, we propose Memory of Amortized Contexts (MAC), an efficient and effective online adaptation framework for LLMs with strong knowledge retention. Specifically, we introduce a feature extraction and memory-augmentation approach that compresses information from new documents into compact modulations stored in a memory bank.
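The core idea described in the abstract (compress each new document into a compact modulation, store it in a memory bank, and aggregate relevant modulations at query time) can be illustrated with a minimal sketch. This is not the paper's implementation: the `amortize` compressor, the embedding dimension, and the attention-style retrieval below are all placeholder assumptions standing in for the learned components.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 16  # modulation dimension (hypothetical)

# Memory bank of compact modulations, one entry per compressed document.
memory_keys = []    # retrieval keys
memory_values = []  # compact modulations ("amortized contexts")

def amortize(document_embedding):
    """Stand-in for the learned compressor: maps a document embedding
    to a retrieval key and a compact modulation."""
    key = document_embedding / np.linalg.norm(document_embedding)
    value = np.tanh(document_embedding)  # placeholder compression
    return key, value

def store(document_embedding):
    """Online adaptation step: add a new document's modulation to the bank."""
    key, value = amortize(document_embedding)
    memory_keys.append(key)
    memory_values.append(value)

def retrieve(query_embedding):
    """Attention-weighted aggregation over the memory bank, producing a
    single modulation that would condition the (frozen) base LLM."""
    K = np.stack(memory_keys)
    V = np.stack(memory_values)
    scores = K @ (query_embedding / np.linalg.norm(query_embedding))
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ V

# Stream five new "documents" (random embeddings here) into the bank,
# then retrieve a modulation for a query.
for _ in range(5):
    store(rng.standard_normal(DIM))

modulation = retrieve(rng.standard_normal(DIM))
print(modulation.shape)  # (16,)
```

Because the base model's parameters are never updated, adaptation cost is dominated by the cheap compression and retrieval steps, which is the efficiency argument the abstract makes.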
Mar-27-2025, 13:36:57 GMT
- Country:
  - Europe (0.14)
- Genre:
  - Instructional Material (0.67)
  - Research Report > Experimental Study (1.00)
- Industry:
  - Education > Educational Setting > Online (0.48)
- Technology: