Pre-training Limited Memory Language Models with Internal and External Knowledge
Zhao, Linxi, Zalouk, Sofian, Belardi, Christian K., Lovelace, Justin, Zhou, Jin Peng, Noonan, Ryan Thomas, Go, Dongyoung, Weinberger, Kilian Q., Artzi, Yoav, Sun, Jennifer J.
arXiv.org Artificial Intelligence
Neural language models are black boxes: both linguistic patterns and factual knowledge are distributed across billions of opaque parameters. This entangled encoding makes it difficult to reliably inspect, verify, or update specific facts. We introduce Limited Memory Language Models (LMLM), a new class of language models that externalize factual knowledge to an external database during pre-training rather than memorizing it. Our pre-training approach strategically masks externally retrieved factual values from the training loss, thereby teaching the model to perform targeted lookups rather than rely on memorization in its weights. Our experiments demonstrate that LMLMs achieve performance competitive with significantly larger LLMs on standard benchmarks, while offering the advantages of an explicit, editable, and verifiable knowledge base.
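The loss-masking idea in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the span format, the `-100` ignore index (a common convention in language-model training), and the toy per-token losses are all assumptions made for the example.

```python
# Hypothetical sketch: exclude externally retrieved factual values from
# the next-token training loss, so the model learns to look facts up
# rather than memorize them in its weights.
IGNORE = -100  # assumed ignore-index convention; masked positions get no loss

def mask_fact_spans(token_ids, fact_spans):
    """Copy token_ids into a label sequence, replacing every position
    inside a retrieved-fact span (end-exclusive) with IGNORE."""
    labels = list(token_ids)
    for start, end in fact_spans:
        for i in range(start, end):
            labels[i] = IGNORE
    return labels

def masked_mean_nll(token_nlls, labels):
    """Average per-token negative log-likelihood over unmasked positions only."""
    kept = [nll for nll, y in zip(token_nlls, labels) if y != IGNORE]
    return sum(kept) / len(kept)

# Toy example: positions 3-4 hold a looked-up factual value.
tokens = [11, 42, 7, 305, 306, 9]
labels = mask_fact_spans(tokens, fact_spans=[(3, 5)])
print(labels)  # [11, 42, 7, -100, -100, 9]

nlls = [0.5, 0.2, 0.1, 9.0, 9.0, 0.3]  # high loss on the fact tokens
print(masked_mean_nll(nlls, labels))   # 0.275 -- fact tokens contribute nothing
```

In effect, the model is never rewarded for reproducing the fact tokens themselves, only the surrounding text that triggers the lookup.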
Oct-6-2025