Large Language Models

Communications of the ACM 

I can remember the days when indexing text meant compiling lists of pages on which a word appeared, or finding pages in which "keywords" appeared in context. Then came full-text search, as exemplified by the Google search engine. Pages found on the World Wide Web are indexed word by word, and the retrieved Web page references are rank-ordered by an elaboration of the original "PageRank" concept developed by the founders of Google, Larry Page and Sergey Brin. Large language models (LLMs) represent a very different way of performing information retrieval. I am no expert in this field, but my cartoon model of the LLM notion follows: a statistical model is built of the relationships of "tokens" (words or phrases) to each other, for example, the likelihood of their appearing "near" each other.
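To make the cartoon concrete, here is a minimal sketch of such a statistical token model: a bigram model that estimates how likely one token is to follow another, built from raw text. This is an illustrative toy, not how production LLMs work (they use learned neural representations over far larger contexts); the function name and tiny corpus are invented for the example.

```python
from collections import defaultdict, Counter

def build_bigram_model(corpus):
    """Estimate P(next token | previous token) from a list of sentences."""
    counts = defaultdict(Counter)
    for sentence in corpus:
        tokens = sentence.lower().split()
        # Count each adjacent token pair (a statistical "nearness" model).
        for a, b in zip(tokens, tokens[1:]):
            counts[a][b] += 1
    # Normalize counts into conditional probabilities.
    model = {}
    for a, followers in counts.items():
        total = sum(followers.values())
        model[a] = {b: n / total for b, n in followers.items()}
    return model

corpus = ["the cat sat on the mat", "the cat ate"]
model = build_bigram_model(corpus)
# "cat" follows "the" in 2 of the 3 occurrences of "the":
print(model["the"]["cat"])  # → 0.666...
```

A real LLM generalizes this idea enormously: instead of counting adjacent pairs, it learns a parameterized function over long token contexts, but the underlying object is still a statistical model of which tokens tend to appear near which others.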
