Can Pretrained Language Models Replace Knowledge Bases?

The recent rapid development of pretrained language models has produced significant performance improvements on downstream NLP tasks. Because these models compile and store relational knowledge encountered in their training data, researchers from Facebook AI Research and University College London introduced the LAMA (LAnguage Model Analysis) probe to explore the feasibility of using language models as knowledge bases.

The term "knowledge base" was introduced in the 1970s. Unlike databases, which store figures, tables, and other straightforward data in computer memory, a knowledge base can hold more complex structured and unstructured information. A knowledge base system can be likened to a library that stores facts about a specific field.
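LAMA probes a model by turning a fact into a cloze statement and checking whether the model ranks the correct answer highly for the masked slot. The snippet below is a minimal sketch of this idea using the Hugging Face transformers library, not the official LAMA code; the model choice and example sentence (drawn from the paper's "Dante was born in" illustration) are for demonstration only.

```python
# Minimal sketch of a LAMA-style cloze probe (not the official LAMA code).
# Assumes the Hugging Face `transformers` library; the model is illustrative.
from transformers import pipeline

# BERT is one of the pretrained models probed in the LAMA paper.
fill_mask = pipeline("fill-mask", model="bert-base-cased")

# A fact such as (Dante, born-in, Florence) becomes a cloze statement.
statement = "Dante was born in [MASK]."

# The model's top-ranked completions stand in for a knowledge-base query.
for candidate in fill_mask(statement, top_k=5):
    print(f"{candidate['token_str']:>12}  score={candidate['score']:.3f}")
```

On examples like this, BERT's top predictions tend to include plausible place names; LAMA quantifies this behavior at scale across many relation types.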