

A Practitioner's Guide to Continual Multimodal Pretraining

Neural Information Processing Systems

However, practical model deployment often operates in the gap between these two limiting cases, as real-world applications demand adaptation to specific subdomains, tasks, or concepts spread over a model's entire, varying life cycle.


Online Adaptation of Language Models with a Memory of Amortized Contexts

Neural Information Processing Systems

However, given the ever-expanding corpus of unseen documents and the large parameter space of modern LLMs, efficient adaptation is essential. To address these challenges, we propose Memory of Amortized Contexts (MAC), an efficient and effective online adaptation framework for LLMs with strong knowledge retention.
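The core idea sketched in the abstract, compressing incoming documents into compact representations and retrieving them by attention at query time instead of updating model weights, can be illustrated with a toy example. This is a minimal sketch under stated assumptions, not MAC's actual implementation: the bag-of-words `encode` stands in for the paper's learned amortization network, and all class and function names here are hypothetical.

```python
# Toy sketch of an amortized-context memory (illustrative only; MAC uses a
# learned encoder and modulates the LM, neither of which is reproduced here).
import math
from collections import Counter

def encode(doc):
    """Stand-in for amortized compression: a normalized bag-of-words vector.
    MAC would instead learn a compact modulation per document."""
    counts = Counter(doc.lower().split())
    norm = math.sqrt(sum(v * v for v in counts.values()))
    return {tok: v / norm for tok, v in counts.items()}

def dot(q, k):
    """Similarity between a query encoding and a stored context encoding."""
    return sum(w * k.get(tok, 0.0) for tok, w in q.items())

class AmortizedMemory:
    """Stores compressed contexts; retrieves with softmax attention, so new
    knowledge is added online without any gradient update to the model."""
    def __init__(self):
        self.keys, self.docs = [], []

    def add(self, doc):
        # Online update: encode once, append to the memory bank.
        self.keys.append(encode(doc))
        self.docs.append(doc)

    def retrieve(self, query):
        q = encode(query)
        scores = [dot(q, k) for k in self.keys]
        weights = [math.exp(s) for s in scores]   # softmax attention weights
        z = sum(weights)
        best = max(range(len(scores)), key=scores.__getitem__)
        return self.docs[best], [w / z for w in weights]

mem = AmortizedMemory()
mem.add("the kernel schedules processes with priorities")
mem.add("transformers adapt to new documents via fine tuning")
doc, attn = mem.retrieve("how do transformers adapt to documents")
```

The design point this sketch captures is the trade-off the abstract alludes to: adaptation cost becomes one encoding pass per document rather than a parameter update over the LLM's full weight space, which is what makes the approach viable as the corpus of unseen documents keeps growing.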



Towards General Loop Invariant Generation: A Benchmark of Programs with Memory Manipulation

Neural Information Processing Systems

We collect 312 programs from various sources, including everyday programs from college homework, an international verification competition (SV-COMP), benchmarks from previous papers (SLING), and programs from real-world software systems (the Linux kernel, glibc, LiteOS, and Zephyr).
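For readers unfamiliar with the task the benchmark targets: a loop invariant is a property that holds before the loop and is preserved by every iteration, and it is the key ingredient for proving a loop correct. A minimal illustrative example (not a program from the benchmark, and in Python rather than the C the cited systems are written in):

```python
# Illustrative loop invariant for a summation loop, checked at runtime.
# An invariant generator of the kind the benchmark evaluates would aim to
# infer the commented property automatically from the code alone.
def sum_to(n):
    total, i = 0, 0
    while i < n:
        # Invariant: total == i * (i + 1) // 2  and  0 <= i <= n
        assert total == i * (i + 1) // 2 and 0 <= i <= n
        total += i + 1
        i += 1
    # Invariant plus the exit condition (i == n) imply the postcondition:
    assert total == n * (n + 1) // 2
    return total
```

Memory-manipulating programs like those drawn from the Linux kernel are much harder than this arithmetic example, since their invariants must also describe heap shapes (lists, trees, aliasing), which is what motivates a dedicated benchmark.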