Plotting

Energy


AI Is Eating Data Center Power Demand--and It's Only Getting Worse

WIRED

AI's energy use already represents as much as 20 percent of global data-center power demand, research published Thursday in the journal Joule shows. That demand from AI, the research states, could double by the end of this year, comprising nearly half of all data-center electricity consumption worldwide, excluding the electricity used for bitcoin mining. The research appears as a commentary by Alex de Vries-Gao, the founder of Digiconomist, a research company that evaluates the environmental impact of technology. De Vries-Gao started Digiconomist in the late 2010s to explore the impact that bitcoin mining, another extremely energy-intensive activity, would have on the environment. Looking at AI, he says, has grown more urgent over the past few years because of the widespread adoption of ChatGPT and other large language models that use massive amounts of energy. According to his research, worldwide AI energy demand is now set to surpass demand from bitcoin mining by the end of this year.


Report: Creating a 5-second AI video is like running a microwave for an hour

Mashable

You've probably heard the statistic that every search on ChatGPT uses the equivalent of a bottle of water. And while that's technically true, it misses some of the nuance. MIT Technology Review dropped a massive report that reveals how the artificial intelligence industry uses energy -- and exactly how much energy it costs to use a service like ChatGPT. The report determined that responses from large language models like ChatGPT cost anywhere from 114 to 6,706 joules each -- the difference between running a microwave for one-tenth of a second and running it for eight seconds. The lower-energy models, according to the report, use less energy because they use fewer parameters, which also means the answers tend to be less accurate.
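The microwave comparison is simple energy-over-power arithmetic (seconds = joules / watts). A quick sketch of the conversion, assuming a typical household microwave of roughly 800 W -- the report's exact wattage assumption isn't stated here:

```python
# Back-of-envelope check of the microwave comparison.
# Assumption (not from the article): a typical ~800 W household microwave.
MICROWAVE_WATTS = 800

def microwave_seconds(joules: float, watts: float = MICROWAVE_WATTS) -> float:
    """Seconds a microwave of the given power must run to consume `joules`."""
    return joules / watts

# The per-response energy range cited in the report:
low = microwave_seconds(114)     # cheapest response
high = microwave_seconds(6706)   # most expensive response
print(f"{low:.2f} s to {high:.1f} s")  # prints "0.14 s to 8.4 s"
```

At that assumed wattage the cited range works out to roughly a tenth of a second on the low end and a bit over eight seconds on the high end, which lines up with the article's framing.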


The Download: Google's AI mission, and America's reliance on natural gas

MIT Technology Review

If you want to know where AI is headed, this year's Google I/O has you covered. The company's annual showcase of next-gen products, which kicked off yesterday, has all of the pomp and pizzazz, the sizzle reels and celebrity walk-ons, that you'd expect from a multimillion-dollar marketing event. But it also shows us just how fast this still-experimental technology is being subsumed into a line-up designed to sell phones and subscription tiers. Never before have I seen this thing we call artificial intelligence appear so normal. Last December, Meta announced plans to build a massive $10 billion data center for training its artificial intelligence models in rural northeast Louisiana.


Can nuclear power really fuel the rise of AI?

MIT Technology Review

This story is a part of MIT Technology Review's series "Power Hungry: AI and our energy future," on the energy demands and carbon costs of the artificial-intelligence revolution. These somewhat unlikely partnerships could be a win for both the nuclear power industry and large tech companies. Tech giants need guaranteed sources of energy, and many are looking for low-emissions ones to hit their climate goals. For nuclear plant operators and nuclear technology developers, the financial support of massive established customers could help keep old nuclear power plants open and push new technologies forward. "There [are] a lot of advantages to nuclear," says Michael Terrell, senior director of clean energy and carbon reduction at Google.


Everything you need to know about estimating AI's energy and emissions burden

MIT Technology Review

Billions of dollars are being poured into reshaping energy infrastructure around the needs of AI, yet no one has settled on a way to quantify AI's energy usage. Worse, companies are generally unwilling to disclose their own piece of the puzzle. There are also limitations to estimating the emissions associated with that energy demand, because the grid hosts a complicated, ever-changing mix of energy sources. With that said, here are the many variables, assumptions, and caveats that we used to calculate the consequences of an AI query. Companies like OpenAI, dealing in "closed-source" models, generally offer access to their systems through an interface where you input a question and receive an answer.


AI's energy impact is still small--but how we handle it is huge

MIT Technology Review

Innovation in IT got us to this point. Graphics processing units (GPUs) that power the computing behind AI have fallen in cost by 99% since 2006. There was similar concern about the energy use of data centers in the early 2010s, with wild projections of growth in electricity demand. But gains in computing power and energy efficiency not only proved these projections wrong but enabled a 550% increase in global computing capability from 2010 to 2018 with only minimal increases in energy use. In the late 2010s, however, the trends that had saved us began to break.


We did the math on AI's energy footprint. Here's the story you haven't heard.

MIT Technology Review

AI's integration into our lives is the most significant shift in online life in more than a decade. Hundreds of millions of people now regularly turn to chatbots for help with homework, research, coding, or to create images and videos. Today, new analysis by MIT Technology Review provides an unprecedented and comprehensive look at how much energy the AI industry uses--down to a single query--to trace where its carbon footprint stands now, and where it's headed, as AI barrels towards billions of daily users. This story is a part of MIT Technology Review's series "Power Hungry: AI and our energy future," on the energy demands and carbon costs of the artificial-intelligence revolution. We spoke to two dozen experts measuring AI's energy demands, evaluated different AI models and prompts, pored over hundreds of pages of projections and reports, and questioned top AI model makers about their plans.


Four reasons to be optimistic about AI's energy usage

MIT Technology Review

"Dollars are being invested, GPUs are being burned, water is being evaporated--it's just absolutely the wrong direction," says Ali Farhadi, CEO of the Seattle-based nonprofit Allen Institute for AI. But sift through the talk of rocketing costs--and climate impact--and you'll find reasons to be hopeful. There are innovations underway that could improve the efficiency of the software behind AI models, the computer chips those models run on, and the data centers where those chips hum around the clock. Here's what you need to know about how energy use, and therefore carbon emissions, could be cut across all three of those domains, plus an added argument for cautious optimism: There are reasons to believe that the underlying business realities will ultimately bend toward more energy-efficient AI. The most obvious place to start is with the models themselves--the way they're created and the way they're run.


OpenAI's Sam Altman thanks Sen John Fetterman for 'normalizing hoodies'

FOX News

Sen. John Fetterman, D-Pa., was one of the final senators to question OpenAI chief Sam Altman during Thursday's Senate Commerce Committee hearing, and the subject of both Three Mile Island and the Democrat's penchant for Carhartt outerwear came up. Fetterman said that as a senator he has been able to meet people with "much more impressive jobs and careers" and that due to Altman's technology, "humans will have a wonderful ability to adapt." He told Altman that some Americans are worried about AI on various levels, and he asked the executive to address it. In response, Altman said he appreciated Fetterman's praise.


The Download: AI benchmarks, and Spain's grid blackout

MIT Technology Review

SWE-Bench (pronounced "swee bench") launched in November 2024 as a way to evaluate an AI model's coding skill. It has since quickly become one of the most popular tests in AI. A SWE-Bench score has become a mainstay of major model releases from OpenAI, Anthropic, and Google--and outside of foundation models, the fine-tuners at AI firms are in constant competition to see who can rise above the pack. Despite all the fervor, this isn't exactly a truthful assessment of which model is "better." Entrants have begun to game the system--which is pushing many others to wonder whether there's a better way to actually measure AI achievement.