Facebook's latest giant language AI hits computing wall at 500 Nvidia GPUs (ZDNet)
Facebook's giant "XLM-R" neural network is engineered to work word problems across 100 different languages, including Swahili and Urdu, but it runs up against computing constraints even using 500 of Nvidia's world-class GPUs. With a trend to bigger and bigger machine learning models, state-of-the-art artificial intelligence research continues to run up against the limits of conventional computing technology. Last week they published a report on their invention, XLM-R, a natural language model based on the wildly popular Transformer model from Google. XLM-R is engineered to be able to perform translations between one hundred different languages. It builds upon work that Conneau did earlier this year with Guillaume Lample at Facebook, the creation of the initial XLM.
Nov-17-2019, 13:07:39 GMT
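
XLM-R's checkpoints were later released publicly. As a rough illustration of what a single cross-lingual masked language model looks like in use, the sketch below queries an XLM-R checkpoint through the Hugging Face transformers library; the checkpoint name ("xlm-roberta-base") and the example prompts are assumptions made for illustration and are not drawn from the article.

```python
# A minimal sketch, assuming the publicly released "xlm-roberta-base"
# checkpoint and the Hugging Face transformers library; the checkpoint
# name and the prompts are illustrative, not details from the article.
from transformers import pipeline

# Fill-mask prediction with a single cross-lingual model: the same
# weights and vocabulary serve every input language.
fill_mask = pipeline("fill-mask", model="xlm-roberta-base")

prompts = [
    "The capital of France is <mask>.",        # English
    "پیرس <mask> کا دارالحکومت ہے۔",           # Urdu: "Paris is the capital of <mask>."
]

for prompt in prompts:
    print(prompt)
    # Each prediction is a dict with the filled-in token and its probability.
    for prediction in fill_mask(prompt, top_k=3):
        print(" ", prediction["token_str"], round(prediction["score"], 3))
```

The point of the sketch is only that one set of weights answers prompts in different languages; it says nothing about the 500-GPU training setup the article describes.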