

The Price of Progress: Algorithmic Efficiency and the Falling Cost of AI Inference

Gundlach, Hans, Lynch, Jayson, Mertens, Matthias, Thompson, Neil

arXiv.org Artificial Intelligence

Language models have seen enormous progress on advanced benchmarks in recent years, but much of this progress has only been possible by using more costly models. Benchmarks may therefore present a warped picture of progress in practical capabilities per dollar. To remedy this, we use data from Artificial Analysis and Epoch AI to form the largest dataset of current and historical prices to run benchmarks to date. We find that the price for a given level of benchmark performance has decreased remarkably fast, around 5× to 10× per year, for frontier models on knowledge, reasoning, math, and software engineering benchmarks. These reductions in the cost of AI inference are due to economic forces, hardware efficiency improvements, and algorithmic efficiency improvements. Isolating out open models to control for competition effects and dividing by hardware price declines, we estimate that algorithmic efficiency progress is around 3× per year. Finally, we recommend that evaluators both publicize and take into account the price of benchmarking as an essential part of measuring the real-world impact of AI.
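The decomposition in the abstract — dividing the overall price decline by hardware price-performance gains to isolate the algorithmic component — can be sketched as a simple ratio of annual improvement factors. The specific hardware figure below is an illustrative assumption, not a number from the paper; only the 5×–10× overall range and the ~3× algorithmic result come from the abstract.

```python
# Illustrative decomposition of the annual price decline for a fixed
# benchmark score. Overall decline = hardware gains x algorithmic gains,
# so the algorithmic factor is the overall factor divided by the
# hardware factor. The hardware value here is assumed for illustration.
total_decline = 9.0      # overall price drop per year (within the paper's 5x-10x range)
hardware_decline = 3.0   # assumed annual hardware price-performance gain
algorithmic_decline = total_decline / hardware_decline

print(f"implied algorithmic efficiency gain: ~{algorithmic_decline:.0f}x per year")
```

Under these assumed inputs the implied algorithmic factor matches the paper's ~3× per year estimate; different hardware assumptions within the 5×–10× overall range would shift it accordingly.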


Trends in AI Supercomputers

Pilz, Konstantin F., Sanders, James, Rahman, Robi, Heim, Lennart

arXiv.org Artificial Intelligence

Frontier AI development relies on powerful AI supercomputers, yet analysis of these systems is limited. We create a dataset of 500 AI supercomputers from 2019 to 2025 and analyze key trends in performance, power needs, hardware cost, ownership, and global distribution. We find that the computational performance of AI supercomputers has doubled every nine months, while hardware acquisition cost and power needs both doubled every year. The leading system in March 2025, xAI's Colossus, used 200,000 AI chips, had a hardware cost of $7B, and required 300 MW of power, as much as 250,000 households. As AI supercomputers evolved from tools for science to industrial machines, companies rapidly expanded their share of total AI supercomputer performance, while the share of governments and academia diminished. Globally, the United States accounts for about 75% of total performance in our dataset, with China in second place at 15%. If the observed trends continue, the leading AI supercomputer in 2030 will achieve 2×10²² 16-bit FLOP/s, use two million AI chips, have a hardware cost of $200 billion, and require 9 GW of power. Our analysis provides visibility into the AI supercomputer landscape, allowing policymakers to assess key AI trends like resource needs, ownership, and national competitiveness.
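The 2030 projections follow directly from compounding the reported doubling times over the roughly five years from March 2025; a quick sketch, assuming a ~2×10²⁰ FLOP/s baseline for Colossus (an inference from its 200,000 chips, not a figure stated in the abstract), reproduces them:

```python
# Extrapolate the paper's 2030 projections from March 2025 baselines
# using the reported doubling times (performance: 9 months; hardware
# cost and power: 12 months). The 2e20 FLOP/s performance baseline is
# an assumption; the $7B cost and 300 MW power come from the abstract.
months = 12 * 5  # March 2025 -> roughly 2030

def extrapolate(baseline, doubling_months, months=months):
    """Compound growth given a doubling time in months."""
    return baseline * 2 ** (months / doubling_months)

perf_2030 = extrapolate(2e20, 9)      # FLOP/s (16-bit)
cost_2030 = extrapolate(7e9, 12)      # USD hardware cost
power_2030 = extrapolate(300e6, 12)   # watts

print(f"performance ~{perf_2030:.1e} FLOP/s")  # ~2e22, matching the abstract
print(f"cost ~${cost_2030 / 1e9:.0f}B")        # ~$224B, i.e. ~$200 billion
print(f"power ~{power_2030 / 1e9:.1f} GW")     # ~9.6 GW, i.e. ~9 GW
```

Five annual doublings give a 32× factor for cost and power, while 60/9 ≈ 6.7 nine-month doublings give roughly 100× for performance, which is why performance outruns the other two.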


Why the US Government Banned Investments in Some Chinese AI Startups

WIRED

Late last month, the US Treasury Department finalized new restrictions limiting what kinds of Chinese tech startups US venture capital firms can invest in for national security reasons. When they go into effect in January, the long-awaited measures will stop American VCs and other investors from pouring money into cutting-edge Chinese AI models. After president-elect Trump takes office a few weeks later, his administration may expand the rules and make them even tougher. While the US still leads the world in advanced AI development, the American government has grown increasingly concerned about China catching up soon. The new outbound investment restrictions are designed to work alongside other measures, such as export controls on advanced computer chips and the Committee on Foreign Investment in the United States (CFIUS), to collectively hamper--or at least slow down--the progress of Chinese AI companies.


The Researcher Trying to Glimpse the Future of AI

TIME - Tech

Imagine if the world's response to climate change relied solely on speculative predictions from pundits and CEOs, rather than the rigorous--though still imperfect--models of climate science. "Two degrees of warming will arrive soon-ish but will change the world less than we all think," one might say. "Two degrees of warming is not just around the corner. This is going to take a long time," another could counter. This is more or less the world we're in with artificial intelligence, with OpenAI CEO Sam Altman saying that AI systems that can do any task a human can will be developed in the "reasonably close-ish future," while Yann LeCun, Chief AI Scientist at Meta, argues that human-level AI systems are "going to take a long time."


The Billion-Dollar Price Tag of Building AI

TIME - Tech

Artificial intelligence executives have big plans--and they're not cheap. In a recent interview with TIME, Dario Amodei, CEO of AI company Anthropic, predicted that the cost to develop the next generation of AI systems, due for release later this year, would be around $1 billion. That trend suggests the generation after would cost more like $10 billion. Amodei is not the only one preparing for a spending spree: Microsoft and OpenAI are reportedly planning a $100 billion supercomputer to build and run AI models.