This week, Microsoft and Nvidia announced that they have trained what they claim is one of the largest and most capable AI language models to date: Megatron-Turing Natural Language Generation (MT-NLG). MT-NLG contains 530 billion parameters -- the values a model learns from its training data -- and achieves leading accuracy on a broad set of tasks, including reading comprehension and natural language inference. But building it didn't come cheap: experts peg the cost at millions of dollars. Like other large AI systems, MT-NLG raises questions about the accessibility of cutting-edge research approaches in machine learning.
Oct-17-2021, 10:05:24 GMT