Distilling What We Know
The sheer size and complexity of today's generative pretrained transformer (GPT) models is nothing short of astounding. OpenAI's GPT-3, for example, has roughly 175 billion parameters, and there is speculation that GPT-4 could have as many as 10 trillion.a All of this introduces enormous overhead in terms of required cloud resources, including compute cycles and energy consumption. At present, the computing power required to train state-of-the-art artificial intelligence (AI) models is rising at a rate of 15x every two years.b The cost of training a large GPT model can run into the millions of dollars.c