
### Measuring the Algorithmic Efficiency of Neural Networks

Three factors drive the advance of AI: algorithmic innovation, data, and the amount of compute available for training. Algorithmic progress has traditionally been more difficult to quantify than compute and data. In this work, we argue that algorithmic progress has an aspect that is both straightforward to measure and interesting: reductions over time in the compute needed to reach past capabilities. We show that the number of floating-point operations required to train a classifier to AlexNet-level performance on ImageNet decreased by a factor of 44 between 2012 and 2019. This corresponds to algorithmic efficiency doubling every 16 months over a period of 7 years. By contrast, Moore's Law would have yielded only an 11x cost improvement over the same period. We observe that hardware and algorithmic efficiency gains multiply and can be on a similar scale over meaningful horizons, which suggests that a good model of AI progress should integrate measures of both.
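The headline numbers are self-consistent, and it can help to see the arithmetic spelled out. The sketch below (an illustration, not code from the paper) derives the ~16-month doubling time from the 44x gain over 7 years, and the ~11x Moore's Law baseline from a doubling every 2 years:

```python
import math

# Reported figures from the abstract above.
efficiency_gain = 44.0   # compute reduction for AlexNet-level training, 2012-2019
years = 7.0
months = years * 12      # 84 months

# Number of doublings needed to reach a 44x improvement: log2(44) ~= 5.46
doublings = math.log2(efficiency_gain)

# Doubling time implied by the measured gain.
doubling_time_months = months / doublings
print(round(doubling_time_months, 1))  # ~15.4 months, i.e. roughly 16 months

# Moore's Law baseline: a doubling in hardware price-performance every
# 2 years compounds to 2^(7/2) over the same window.
moores_law_gain = 2 ** (years / 2)
print(round(moores_law_gain, 1))  # ~11.3x, matching the "11x" comparison
```

Because the two gains multiply, combining them over this window would imply roughly a 44 x 11 ≈ 500x reduction in the effective cost of reaching AlexNet-level performance.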

### What is More Important? Productivity or Efficiency?

Is productivity more important than efficiency? This is another installment in our "Great Articles You May Have Missed" series. Would you rather do the same with less, or do more with the same? That's the conundrum posed in a recent Harvard Business Review (HBR) article, "Great Companies Obsess Over Productivity, Not Efficiency." I'd venture that most of us don't really know the difference between the two terms, so let's start by defining each.