When it comes to the neural networks that power today's artificial intelligence, sometimes the bigger they are, the smarter they are too. Recent leaps in machine understanding of language, for example, have hinged on building some of the most enormous AI models ever and stuffing them with huge gobs of text.

A new cluster of computer chips could now help these networks grow to almost unimaginable size, and show whether going ever larger may unlock further AI advances, not only in language understanding but perhaps also in areas like robotics and computer vision.

Cerebras Systems, a startup that has already built the world's largest computer chip, has now developed technology that lets a cluster of those chips run AI models more than a hundred times bigger than the most gargantuan ones around today. Cerebras says it can now run a neural network with 120 trillion connections, the mathematical analogs of the synapses that link biological neurons.
Aug-24-2021, 19:13:34 GMT