Why distributed AI is key to pushing the AI innovation envelope
The future of AI is distributed, said Ion Stoica, co-founder, executive chairman and president of Anyscale, on the first day of VB Transform. That's because model complexity shows no signs of slowing down. "For the past couple of years, the compute requirements to train a state-of-the-art model, depending on the data set, grow between 10 times and 35 times every 18 months," he said.

Just five years ago, the largest models fit on a single GPU; today, it takes hundreds or even thousands of GPUs just to hold the parameters of the most advanced models. PaLM, Google's Pathways Language Model, has 540 billion parameters, and that's only about half the size of the largest models, which exceed 1 trillion parameters.
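To see why parameter counts alone push models across many GPUs, a rough back-of-the-envelope calculation helps. The figures below are assumptions, not numbers from Stoica's talk: an 80GB accelerator, 2 bytes per parameter for fp16 inference weights, and a common rule of thumb of roughly 16 bytes per parameter during training once gradients and Adam optimizer state are included.

```python
import math

GPU_MEMORY_GB = 80  # assumed accelerator memory (e.g., an 80 GB GPU)

def gpus_to_hold(num_params: float, bytes_per_param: float) -> int:
    """Minimum GPUs whose combined memory can hold the given state."""
    total_gb = num_params * bytes_per_param / 1e9
    return math.ceil(total_gb / GPU_MEMORY_GB)

# Inference: fp16 weights alone, 2 bytes per parameter.
print(gpus_to_hold(540e9, 2))   # 14 GPUs just to store 540B weights

# Training: weights + gradients + Adam optimizer state,
# assuming ~16 bytes per parameter.
print(gpus_to_hold(540e9, 16))  # 108 GPUs before any activations

# A trillion-parameter model pushes the floor to 200 GPUs.
print(gpus_to_hold(1e12, 16))
```

These are memory floors only; activation memory, batch sizes and communication overheads push real training clusters into the hundreds or thousands of GPUs, which is the regime Stoica is describing.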