Why Commonsense Knowledge is not (and cannot be) Learned

#artificialintelligence 

Commonsense (background) knowledge, at least the kind of knowledge that we fetch and rely upon in the process of language understanding, (i) cannot be learned by processing vast amounts of text, because that knowledge is never explicitly stated in the text (you cannot find what is not there); and (ii) cannot be learned perceptually from observation, since the bulk of the crucial background knowledge is universal, neither probabilistic nor approximate, and so is not susceptible to individual observations. The shared background knowledge needed in language understanding obeys and respects the laws of nature, and as such it has to be codified. In fact, that knowledge must be codified in a symbolic system that quantifies over variables of specific ontological types.

There is a consensus among researchers investigating the neurological, psychological, and evolutionary aspects of human linguistic communication that languages have evolved according to the information-theoretic principle of least effort. Specifically, it has been established that interacting communicative agents tend to produce utterances that minimize both the complexity of encoding a thought and the effort of decoding linguistic utterances back to the intended thought [1], thus finding an optimal point where the effort of speaker and listener alike is minimal.
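To make the idea of "quantifying over variables of specific ontological types" concrete, here is a minimal, hypothetical sketch (the type names, the `Fact` record, and the `infer_spills` rule are all illustrative inventions, not an existing system): a universal background rule is written once over ontological types, rather than being induced statistically from text, and it licenses an inference that ordinary text almost never states explicitly.

```python
# A minimal sketch of background knowledge codified as a typed symbolic rule.
# All names here are hypothetical, for illustration only.

from dataclasses import dataclass

# Ontological types represented as a simple class hierarchy.
class Entity: pass
class PhysicalObject(Entity): pass
class Liquid(PhysicalObject): pass
class Container(PhysicalObject): pass

@dataclass
class Fact:
    relation: str
    args: tuple

def holds(kb, relation, *args):
    """Check whether a ground fact is present in the knowledge base."""
    return Fact(relation, args) in kb

# A typed axiom: for all x of type Liquid and all y of type Container,
# if x is in y and y tips over, then x spills. The rule is universal,
# not probabilistic, and applies only to variables of the right types.
def infer_spills(kb, x, y):
    return (isinstance(x, Liquid)
            and isinstance(y, Container)
            and holds(kb, "in", x, y)
            and holds(kb, "tips_over", y))

coffee = Liquid()
cup = Container()
kb = [Fact("in", (coffee, cup)), Fact("tips_over", (cup,))]
print(infer_spills(kb, coffee, cup))  # True
```

The point of the sketch is that the type constraints do the work: the same rule returns nothing when its variables are instantiated with entities of the wrong ontological type, which is exactly the kind of discipline that statistical induction over text does not provide.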
