More Room for Language: Investigating the Effect of Retrieval on Language Models

David Samuel, Lucas Georges Gabriel Charpentier, Sondre Wold

arXiv.org Artificial Intelligence 

Retrieval-augmented language models pose a promising alternative to standard language modeling. During pretraining, these models search in a corpus of documents for contextually relevant information that could aid the language modeling objective. We introduce an 'ideal retrieval' methodology to study these models in a fully controllable setting. We conduct an extensive evaluation to examine how retrieval augmentation affects the behavior of the underlying language model. Among other things, we observe that these models: i) save substantially less world knowledge in their weights, ii) are better at understanding […]

Figure 1: The aggregated absolute differences from the baseline across three categories of benchmarks; the models exhibit consistent differences for each category.
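To make the retrieval step described above concrete, the following is a minimal, illustrative sketch of similarity-based document retrieval: given the current context, it returns the most relevant documents from a small corpus using bag-of-words cosine similarity. This is a generic toy example, not the paper's 'ideal retrieval' methodology; all function names and the sample corpus are hypothetical.

```python
# Illustrative sketch of a nearest-neighbour document retriever.
# NOT the paper's 'ideal retrieval' setup; names and data are invented.
import math
import re
from collections import Counter


def bow(text):
    """Bag-of-words vector: a Counter over lowercase word tokens."""
    return Counter(re.findall(r"\w+", text.lower()))


def cosine(a, b):
    """Cosine similarity between two sparse Counter vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def retrieve(context, corpus, k=1):
    """Return the k corpus documents most similar to the context."""
    q = bow(context)
    ranked = sorted(corpus, key=lambda d: cosine(q, bow(d)), reverse=True)
    return ranked[:k]


corpus = [
    "Paris is the capital of France.",
    "The mitochondrion is the powerhouse of the cell.",
    "Transformers process tokens with self-attention.",
]
print(retrieve("What is the capital of France?", corpus))
```

Real retrieval-augmented models replace the bag-of-words scoring with learned dense embeddings and approximate nearest-neighbour search over much larger corpora, but the interface, mapping a context to its most relevant supporting documents, is the same.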
