Generative Grammar
Comparative Analysis of ChatGPT and the Evolution of Language Models
Ogundare, Oluwatosin, Araya, Gustavo Quiros
Interest in Large Language Models (LLMs) has increased drastically since the emergence of ChatGPT and the outstanding positive societal response to the ease with which it performs tasks in Natural Language Processing (NLP). The triumph of ChatGPT, however, is how it seamlessly bridges the divide between language generation and knowledge models. In some cases, it provides anecdotal evidence of a framework for replicating human intuition over a knowledge domain. This paper highlights the prevailing ideas in NLP, including machine translation, machine summarization, question-answering, and language generation, and compares the performance of ChatGPT with the major algorithms in each of these categories using the Spontaneous Quality (SQ) score. A strategy for validating the arguments and results of ChatGPT is presented summarily as an example of safe, large-scale adoption of LLMs.
A generative grammar of cooking
Cooking is a uniquely human endeavor for transforming raw ingredients into delicious dishes. Over centuries, cultures worldwide have evolved diverse cooking practices ingrained in their culinary traditions. Recipes, thus, are cultural capsules that capture culinary knowledge in elaborate cooking protocols. While simple quantitative models have probed the patterns in recipe composition and the process of cuisine evolution, unlike other cultural quirks such as language, the principles of cooking remain hitherto unexplored. The fundamental rules that drive the act of cooking, shaping recipe composition and cuisine architecture, are unclear. Here we present a generative grammar of cooking that captures the underlying culinary logic. By studying an extensive repository of structured recipes, we identify core concepts and rules that together forge a combinatorial system for culinary synthesis. Building on the body of work done in the context of language, the demonstration of a logically consistent generative framework offers profound insights into the act of cooking. Given the central role of food in nutrition and lifestyle disorders, culinary grammar provides leverage to improve public health through dietary interventions beyond applications for creative pursuits such as novel recipe generation.
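The abstract's central claim — that cooking can be described by a combinatorial rule system — can be illustrated with a toy "culinary grammar". This is a hypothetical sketch, not the grammar induced in the paper: the rule names (`DISH`, `STEP`, `TECHNIQUE`, and so on) and the vocabulary are invented for illustration.

```python
import random

# A minimal, hypothetical culinary grammar: rules combine techniques,
# ingredients and seasonings into simple recipe steps.
RECIPE_GRAMMAR = {
    "DISH":       [["STEP", "then", "STEP"]],
    "STEP":       [["TECHNIQUE", "the", "INGREDIENT", "with", "SEASONING"]],
    "TECHNIQUE":  [["saute"], ["roast"], ["steam"]],
    "INGREDIENT": [["onions"], ["carrots"], ["tofu"]],
    "SEASONING":  [["garlic"], ["cumin"], ["ginger"]],
}

def generate(symbol="DISH"):
    """Expand a symbol into a recipe phrase by recursively applying rules."""
    if symbol not in RECIPE_GRAMMAR:  # terminal word
        return symbol
    expansion = random.choice(RECIPE_GRAMMAR[symbol])
    return " ".join(generate(s) for s in expansion)

print(generate())
# e.g. "saute the onions with cumin then roast the tofu with ginger"
```

Even this tiny system generates dozens of distinct "recipes" from a handful of rules, which is the combinatorial leverage the abstract attributes to a grammar of cooking.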
What Is Generative Grammar?
Generative grammar is a theory of human language that posits that the grammatical structure of sentences is produced by the human mind through a generative process. The theory was developed by Noam Chomsky in the late 1950s and 1960s, notably in "Syntactic Structures" (1957) and "Aspects of the Theory of Syntax" (1965). Chomsky argued that his theory was a significant departure from the prevailing structuralist theories of the time, such as those of Ferdinand de Saussure and Roman Jakobson, which in his view were not sufficiently explanatory. In contrast, generative grammar aims for an explanatory power that structuralist theories lack.
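The core idea — a finite set of rules generating a set of well-formed sentences — can be sketched with a small context-free grammar. The grammar and its vocabulary below are invented for illustration; real generative grammars are of course far richer.

```python
from itertools import product

# A toy context-free grammar: each nonterminal maps to a list of possible
# expansions (sequences of nonterminals and terminal words).
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"]],
    "VP": [["V", "NP"]],
    "N":  [["cat"], ["dog"]],
    "V":  [["chased"], ["saw"]],
}

def expand(symbol):
    """Return every string the grammar generates from `symbol`."""
    if symbol not in GRAMMAR:  # terminal word
        return [symbol]
    results = []
    for expansion in GRAMMAR[symbol]:
        for combo in product(*(expand(s) for s in expansion)):
            results.append(" ".join(combo))
    return results

sentences = expand("S")
# 2 nouns x 2 verbs x 2 nouns = 8 sentences, e.g. "the cat chased the dog"
```

The point of the exercise is the generative step itself: the five rules above never enumerate sentences, yet they determine exactly which word strings count as well-formed.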
Discovering Textual Structures: Generative Grammar Induction using Template Trees
Winters, Thomas, De Raedt, Luc
Natural language generation provides designers with methods for automatically generating text, e.g. for creating summaries, chatbots and game content. In practice, text generators are often either learned and hard to interpret, or created by hand using techniques such as grammars and templates. In this paper, we introduce a novel grammar induction algorithm for learning interpretable grammars for generative purposes, called Gitta. We also introduce the novel notion of template trees to discover latent templates in corpora to derive these generative grammars. By using existing human-created grammars, we found that the algorithm can reasonably approximate these grammars using only a few examples. These results indicate that Gitta could be used to automatically learn interpretable and easily modifiable grammars, and thus provide a stepping stone for human-machine co-creation of generative models.
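The template-discovery step can be illustrated in miniature: align example sentences, keep the positions where they agree, and turn the positions where they differ into slots. This is a simplified sketch of the template idea, not the Gitta algorithm itself (which handles unequal lengths and builds full template trees); the `learn_template` function and slot notation are invented for illustration.

```python
def learn_template(examples):
    """Merge same-length example sentences into one template: positions
    where all examples agree stay fixed; the rest become named slots."""
    token_lists = [e.split() for e in examples]
    assert len({len(t) for t in token_lists}) == 1, "examples must align"
    template, slots = [], {}
    for column in zip(*token_lists):
        values = set(column)
        if len(values) == 1:           # all examples agree: fixed text
            template.append(column[0])
        else:                          # examples differ: open a slot
            slot = f"<slot{len(slots)}>"
            slots[slot] = sorted(values)
            template.append(slot)
    return " ".join(template), slots

tpl, slots = learn_template(["I like my cat", "I like my dog"])
# tpl   -> "I like my <slot0>"
# slots -> {"<slot0>": ["cat", "dog"]}
```

Reading the result as a grammar rule — the template is a production whose slots expand to their observed fillers — gives the induction direction the paper describes: from corpus examples back to an interpretable generative grammar.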
Generalisation of language and knowledge models for corpus analysis
This paper takes a new look at language and knowledge modelling for corpus linguistics. Using ideas of Chaitin, an argument is made against the language/knowledge separation in Natural Language Processing. A simple model that generalises approaches to language and knowledge is proposed. One hypothetical consequence of this model is Strong AI.
Abstraction Super-structuring Normal Forms: Towards a Theory of Structural Induction
Silvescu, Adrian, Honavar, Vasant
Induction is the process by which we obtain predictive laws or theories or models of the world. We consider the structural aspect of induction. We ask whether we can find a finite and minimalistic set of operations on structural elements in terms of which any theory can be expressed. We identify abstraction (grouping similar entities) and super-structuring (combining topologically, e.g., spatio-temporally, close entities) as the essential structural operations in the induction process. We show that only two more structural operations, namely, reverse abstraction and reverse super-structuring (the duals of abstraction and super-structuring respectively), suffice in order to exploit the full power of Turing-equivalent generative grammars in induction. We explore the implications of this theorem with respect to the nature of hidden variables, radical positivism and the two-century-old claim of David Hume about the principles of connexion among ideas.
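The two forward operations can be pictured as edits to a grammar's rule set: abstraction introduces a symbol that groups similar entities as alternatives, while super-structuring introduces a symbol that combines adjacent entities into a sequence. The representation below (rules as a dict of alternative sequences) and the function names are invented for illustration; the paper's formal setting is general generative grammars, not this toy encoding.

```python
def abstraction(rules, entities, new_symbol):
    """Group similar entities: new_symbol -> entity_1 | entity_2 | ..."""
    rules = dict(rules)
    rules[new_symbol] = [[e] for e in entities]
    return rules

def super_structure(rules, parts, new_symbol):
    """Combine adjacent entities: new_symbol -> part_1 part_2 ..."""
    rules = dict(rules)
    rules[new_symbol] = [list(parts)]
    return rules

rules = {}
rules = abstraction(rules, ["cat", "dog"], "ANIMAL")     # ANIMAL -> cat | dog
rules = super_structure(rules, ["the", "ANIMAL"], "NP")  # NP -> the ANIMAL
# rules == {"ANIMAL": [["cat"], ["dog"]], "NP": [["the", "ANIMAL"]]}
```

The duals would run these edits in reverse: reverse abstraction splits a grouping symbol back into its members, and reverse super-structuring decomposes a composite symbol into its parts.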