David O. Houwen on LinkedIn: #conversational #AI #neuralnetworks


DeepMind: Why is AI so good at language? It's something in language itself (ZDNet)

How is it that a program such as OpenAI's GPT-3 neural network can answer multiple-choice questions, or write a poem in a particular style, despite never being programmed for those specific tasks? It may be because human language has statistical properties that lead a neural network to expect the unexpected, according to new research by DeepMind, the AI unit of Google.

Natural language, viewed from the point of view of statistics, has qualities that are "non-uniform," such as words that can stand for multiple things, known as "polysemy," like the word "bank," meaning either a place where you put money or a rising mound of earth. There are also words that sound the same but stand for different things, known as homophones, like "here" and "hear." Those qualities of language are the focus of a paper posted on arXiv this month, "Data Distributional Properties Drive Emergent Few-Shot Learning in Transformers," by DeepMind scientists.
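One well-known "non-uniform" property of natural language is that word frequencies roughly follow Zipf's law: the k-th most common word occurs with frequency proportional to 1/k, so a handful of words dominate while most are rare. The sketch below illustrates that skew; the vocabulary size and exponent are illustrative assumptions, not values taken from the DeepMind paper.

```python
def zipf_probs(vocab_size: int, exponent: float = 1.0) -> list[float]:
    """Probability of the k-th most frequent word under Zipf's law.

    The unnormalized weight of rank k is 1 / k**exponent; dividing by the
    total turns the weights into a probability distribution.
    """
    weights = [1.0 / (k ** exponent) for k in range(1, vocab_size + 1)]
    total = sum(weights)
    return [w / total for w in weights]

# With a toy 10-word vocabulary, the top-ranked word takes about a third of
# all probability mass, while a uniform distribution would give each word 0.1.
probs = zipf_probs(10)
print(f"rank 1: {probs[0]:.3f}, rank 10: {probs[9]:.3f}, uniform: {1/10:.3f}")
```

This heavy-tailed skew, together with ambiguity like polysemy, is the kind of distributional structure the paper argues can push a transformer toward few-shot learning behavior.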
