MIT Technology Review
The Download: the desert data center boom, and how to measure Earth's elevations
In the high desert east of Reno, Nevada, construction crews are flattening the golden foothills of the Virginia Range, laying the foundations of a data center city. Google, Tract, Switch, EdgeCore, Novva, Vantage, and PowerHouse are all operating, building, or expanding huge facilities nearby. Meanwhile, Microsoft has acquired more than 225 acres of undeveloped property, and Apple is expanding its existing data center just across the Truckee River from the industrial park. The corporate race to amass computing resources to train and run artificial intelligence models and store information in the cloud has sparked a data center boom in the desert--and it's just far enough away from Nevada's communities to elude wide notice and, some fear, adequate scrutiny. This story is part of Power Hungry: AI and our energy future--our new series shining a light on the energy demands and carbon costs of the artificial intelligence revolution.
Three takeaways about AI's energy use and climate impacts
One key caveat here is that we don't know much about "closed-source" models--for these, companies hold back the details of how they work. Instead, we worked with researchers who measured the energy it takes to run open-source AI models, whose source code is publicly available. By using open-source models, it's possible to directly measure the energy used to respond to a query rather than just guess. The researchers generated text, images, and video and measured the energy the chips running the models required to perform each task. Even just within the text responses, there was a pretty large range of energy needs.
The Download: Google's AI mission, and America's reliance on natural gas
If you want to know where AI is headed, this year's Google I/O has you covered. The company's annual showcase of next-gen products, which kicked off yesterday, has all of the pomp and pizzazz, the sizzle reels and celebrity walk-ons, that you'd expect from a multimillion-dollar marketing event. But it also shows us just how fast this still-experimental technology is being subsumed into a line-up designed to sell phones and subscription tiers. Never before have I seen this thing we call artificial intelligence appear so normal. Last December, Meta announced plans to build a massive $10 billion data center for training its artificial intelligence models in rural northeast Louisiana.
By putting AI into everything, Google wants to make it invisible
Yes, Google's roster of consumer-facing products is the slickest on offer. The firm is bundling most of its multimodal models into its Gemini app, including the new Imagen 4 image generator and the new Veo 3 video generator. That means you can now access Google's full range of generative models via a single chatbot. It also announced Gemini Live, a feature that lets you share your phone's screen or your camera's view with the chatbot and ask it about what it can see. Those features were previously only seen in demos of Project Astra, a "universal AI assistant" that Google DeepMind is working on.
The Download: introducing the AI energy package
It's well documented that AI is a power-hungry technology. But there has been far less reporting on the extent of that hunger, how much its appetite is set to grow in the coming years, where that power will come from, and who will pay for it. For the past six months, MIT Technology Review's team of reporters and editors has worked to answer those questions. The result is an unprecedented look at the state of AI's energy and resource usage: where it is now, where it is headed in the years to come, and why we have to get it right. The centerpiece of this package is an entirely novel line of reporting on the demands of inference--the way human beings interact with AI when we make text queries or ask AI to come up with new images or create videos.
Can nuclear power really fuel the rise of AI?
This story is a part of MIT Technology Review's series "Power Hungry: AI and our energy future," on the energy demands and carbon costs of the artificial-intelligence revolution. These somewhat unlikely partnerships could be a win for both the nuclear power industry and large tech companies. Tech giants need guaranteed sources of energy, and many are looking for low-emissions ones to hit their climate goals. For nuclear plant operators and nuclear technology developers, the financial support of massive established customers could help keep old nuclear power plants open and push new technologies forward. "There [are] a lot of advantages to nuclear," says Michael Terrell, senior director of clean energy and carbon reduction at Google.
How AI is introducing errors into courtrooms
One of Anthropic's lawyers had asked the company's AI model Claude to create a citation for a legal article, but Claude included the wrong title and author. Anthropic's attorney admitted that the mistake was not caught by anyone reviewing the document. Lastly, and perhaps most concerning, is a case unfolding in Israel. After police arrested an individual on charges of money laundering, Israeli prosecutors submitted a request asking a judge for permission to keep the individual's phone as evidence. But they cited laws that don't exist, prompting the defendant's attorney to accuse them of including AI hallucinations in their request.
Everything you need to know about estimating AI's energy and emissions burden
Although billions of dollars are being poured into reshaping energy infrastructure around the needs of AI, no one has settled on a way to quantify AI's energy usage. Worse, companies are generally unwilling to disclose their own piece of the puzzle. There are also limitations to estimating the emissions associated with that energy demand, because the grid hosts a complicated, ever-changing mix of energy sources. With that said, here are the many variables, assumptions, and caveats that we used to calculate the consequences of an AI query. Companies like OpenAI, dealing in "closed-source" models, generally offer access to their systems through an interface where you input a question and receive an answer.
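The arithmetic behind such an estimate is straightforward even though the inputs are contested: multiply the energy a query consumes by the carbon intensity of the grid supplying it. A minimal sketch, using illustrative numbers of our own (not figures from the analysis) for per-query energy and grid carbon intensity, shows why the same query's footprint can vary severalfold by location:

```python
# Sketch: emissions = energy per query (kWh) x grid carbon intensity (g CO2e/kWh).
# All numbers below are illustrative assumptions, not reported measurements.

QUERY_ENERGY_KWH = 0.003  # hypothetical 3 Wh for a single text query

# Hypothetical carbon intensities, in grams CO2e per kWh, for three grid mixes.
GRID_INTENSITY = {
    "hydro-heavy grid": 30,
    "mixed grid": 400,
    "coal-heavy grid": 800,
}

def query_emissions_g(energy_kwh: float, intensity_g_per_kwh: float) -> float:
    """Grams of CO2e attributable to one query on a given grid."""
    return energy_kwh * intensity_g_per_kwh

for grid, intensity in GRID_INTENSITY.items():
    grams = query_emissions_g(QUERY_ENERGY_KWH, intensity)
    print(f"{grid}: {grams:.2f} g CO2e per query")
```

Because the grid's mix shifts hour by hour, a careful estimate replaces the fixed intensities above with time- and location-specific data--which is precisely where the uncertainty described here creeps in.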
AI's energy impact is still small--but how we handle it is huge
Innovation in IT got us to this point. Graphics processing units (GPUs) that power the computing behind AI have fallen in cost by 99% since 2006. There was similar concern about the energy use of data centers in the early 2010s, with wild projections of growth in electricity demand. But gains in computing power and energy efficiency not only proved these projections wrong but enabled a 550% increase in global computing capability from 2010 to 2018 with only minimal increases in energy use. In the late 2010s, however, the trends that had saved us began to break.
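Those two figures imply a dramatic fall in energy per unit of computation. A back-of-the-envelope sketch, assuming for illustration that "minimal increases in energy use" means roughly a 6% rise (a number we are supplying, not one from the article):

```python
# Back-of-the-envelope: how far did energy per unit of computation fall
# from 2010 to 2018, given a 550% rise in global computing capability?
# The 6% energy-growth figure is an illustrative assumption.

compute_multiplier = 1 + 5.50  # a 550% increase means 6.5x the 2010 level
energy_multiplier = 1.06       # assumed "minimal" rise in total energy use

energy_per_compute = energy_multiplier / compute_multiplier
print(f"Energy per unit of computation in 2018: "
      f"{energy_per_compute:.0%} of the 2010 level")
# Roughly a sixfold efficiency gain over eight years.
```

Under these assumptions, each unit of computation in 2018 took about a sixth of the energy it did in 2010--the efficiency trend that, as the article notes, began to break in the late 2010s.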
We did the math on AI's energy footprint. Here's the story you haven't heard.
AI's integration into our lives is the most significant shift in online life in more than a decade. Hundreds of millions of people now regularly turn to chatbots for help with homework, research, coding, or to create images and videos. Today, new analysis by MIT Technology Review provides an unprecedented and comprehensive look at how much energy the AI industry uses--down to a single query--to trace where its carbon footprint stands now, and where it's headed, as AI barrels towards billions of daily users. This story is a part of MIT Technology Review's series "Power Hungry: AI and our energy future," on the energy demands and carbon costs of the artificial-intelligence revolution. We spoke to two dozen experts measuring AI's energy demands, evaluated different AI models and prompts, pored over hundreds of pages of projections and reports, and questioned top AI model makers about their plans.