Better Call SAUL: Fluent and Consistent Language Model Editing with Generation Regularization
Mingyang Wang, Lukas Lange, Heike Adel, Jannik Strötgen, Hinrich Schütze
–arXiv.org Artificial Intelligence
To ensure large language models contain up-to-date knowledge, they need to be updated regularly. However, model editing is challenging because it can also affect knowledge that is unrelated to the new data. State-of-the-art methods identify parameters associated with specific knowledge and then modify them via direct weight updates. However, these locate-and-edit methods suffer from heavy computational overhead and lack theoretical validation. In contrast, directly fine-tuning the model on requested edits alters the model's behavior on unrelated knowledge and significantly damages the model's generation fluency and consistency. To address these challenges, we propose SAUL, a streamlined model editing method that uses sentence concatenation with augmented random facts for generation regularization. Evaluations on three model editing benchmarks show that SAUL is a practical and reliable solution for model editing, outperforming state-of-the-art methods while maintaining generation quality and reducing computational overhead.
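The abstract's core idea, fine-tuning on requested edits concatenated with random facts as a regularizer, can be illustrated with a minimal data-construction sketch. This is a hypothetical illustration, not the paper's implementation: the function name `build_training_examples` and the sample facts are assumptions made for the example.

```python
import random

def build_training_examples(edits, fact_pool, seed=0):
    """Pair each requested edit with a randomly sampled unrelated fact.

    Hypothetical sketch of the described data construction: each edit
    sentence is concatenated with an augmented random fact, so that
    fine-tuning on the edit is anchored by ordinary text, which is
    intended to help preserve fluency on unrelated knowledge.
    """
    rng = random.Random(seed)
    examples = []
    for edit in edits:
        distractor = rng.choice(fact_pool)  # random unrelated fact
        examples.append(f"{edit} {distractor}")
    return examples

# Example usage with made-up facts:
edits = ["The capital of the fictional country Examplia is Newtown."]
pool = [
    "Water boils at 100 degrees Celsius at sea level.",
    "The Pacific is the largest ocean on Earth.",
]
print(build_training_examples(edits, pool))
```

The resulting concatenated strings would then serve as fine-tuning inputs in place of the bare edit sentences.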
Oct-3-2024