NVIDIA's Eos supercomputer just broke its own AI training benchmark record

Engadget 

Depending on the hardware you're using, training a large language model of any significant size can take weeks, months, or even years to complete. That's no way to do business -- nobody has the electricity or the time to wait that long. On Wednesday, NVIDIA unveiled the newest iteration of its Eos supercomputer, one powered by more than 10,000 H100 Tensor Core GPUs and capable of training a 175 billion-parameter GPT-3 model on 1 billion tokens in under four minutes. That's three times faster than the previous record on the MLPerf AI industry-standard benchmark, which NVIDIA set just six months ago. Eos represents an enormous amount of compute.