Constrained Sampling with Primal-Dual Langevin Monte Carlo

Neural Information Processing Systems

This work considers the problem of sampling from a probability distribution known up to a normalization constant while satisfying a set of statistical constraints specified by the expected values of general nonlinear functions. This problem finds applications in, e.g., Bayesian inference, where it can constrain moments to evaluate counterfactual scenarios or enforce desiderata such as prediction fairness. Methods developed to handle support constraints, such as those based on mirror maps, barriers, and penalties, are not suited for this task. This work therefore relies on gradient descent-ascent dynamics in Wasserstein space to put forward a discrete-time primal-dual Langevin Monte Carlo algorithm (PD-LMC) that simultaneously constrains the target distribution and samples from it. We analyze the convergence of PD-LMC under standard assumptions on the target distribution and constraints, namely (strong) convexity and log-Sobolev inequalities. To do so, we bring classical optimization arguments for saddle-point algorithms to the geometry of Wasserstein space. We illustrate the relevance and effectiveness of PD-LMC in several applications.
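To make the primal-dual idea concrete, here is a minimal particle-based sketch, assuming a standard Gaussian target and a single inequality moment constraint E[X] >= 1; the step sizes, particle approximation, and function names are illustrative assumptions, not the paper's reference implementation. Each iteration takes a Langevin step on the Lagrangian potential U(x) + lambda * g(x), then a projected dual ascent step on the multiplier:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (not from the paper): standard Gaussian target,
# constrained so that E[g(X)] <= c with g(x) = -x and c = -1, i.e. E[X] >= 1.
def grad_U(x):          # -grad log target; here target ∝ exp(-x^2 / 2)
    return x

def g(x):               # constraint function
    return -x

def grad_g(x):
    return -np.ones_like(x)

c = -1.0                # constraint level
eta, beta = 1e-2, 1e-1  # primal (Langevin) and dual step sizes
n, steps = 2000, 5000   # particles approximating the law, and iterations

x = rng.standard_normal(n)   # particle cloud
lam = 0.0                    # dual variable (Lagrange multiplier)

for _ in range(steps):
    # Primal: Langevin step on the Lagrangian potential U(x) + lam * g(x)
    noise = rng.standard_normal(n)
    x = x - eta * (grad_U(x) + lam * grad_g(x)) + np.sqrt(2 * eta) * noise
    # Dual: projected gradient ascent using a sample estimate of E[g(X)]
    lam = max(0.0, lam + beta * (g(x).mean() - c))

print(f"E[x] ≈ {x.mean():.3f} (target >= 1), lambda ≈ {lam:.3f}")
```

At the saddle point the multiplier settles where the constraint holds with equality, which is the behavior the paper's convergence analysis formalizes in Wasserstein space.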


Generative Forests

Neural Information Processing Systems

We focus on generative AI for a kind of data that still represents one of the most prevalent forms of data: tabular data. Our paper introduces two key contributions: a powerful new class of forest-based models fit for such tasks, and a simple training algorithm with strong convergence guarantees in a boosting model that parallels the original weak/strong learner setting of supervised learning. This algorithm can be implemented by a few tweaks to the most popular decision-tree induction scheme.
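As rough intuition for what a tree-structured generative model over tabular data can look like, here is a minimal sketch: a single tree whose leaves store empirical masses and axis-aligned supports, sampled by picking a leaf proportionally to its mass. Every detail below is an illustrative assumption; it is not the paper's model class or training algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "generative tree" over 1-D data with a single axis-aligned split.
# Each leaf records how many training points it holds and its interval;
# sampling picks a leaf proportionally to its mass, then draws uniformly
# inside it. Real generative forests are far richer than this.
data = rng.normal(loc=3.0, scale=1.0, size=1000)

split = np.median(data)
leaves = [
    {"lo": data.min(), "hi": split, "mass": np.sum(data <= split)},
    {"lo": split, "hi": data.max(), "mass": np.sum(data > split)},
]

def sample(n):
    masses = np.array([leaf["mass"] for leaf in leaves], dtype=float)
    probs = masses / masses.sum()
    idx = rng.choice(len(leaves), size=n, p=probs)
    return np.array([rng.uniform(leaves[i]["lo"], leaves[i]["hi"]) for i in idx])

print(sample(5))
```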


AI Is Eating Data Center Power Demand--and It's Only Getting Worse

WIRED

AI's energy use already represents as much as 20 percent of global data-center power demand, research published Thursday in the journal Joule shows. That demand from AI, the research states, could double by the end of this year, comprising nearly half of all data-center electricity consumption worldwide, excluding the electricity used for bitcoin mining. The new research is published in a commentary by Alex de Vries-Gao, the founder of Digiconomist, a research company that evaluates the environmental impact of technology. De Vries-Gao started Digiconomist in the late 2010s to explore the impact that bitcoin mining, another extremely energy-intensive activity, would have on the environment. Looking at AI, he says, has grown more urgent over the past few years because of the widespread adoption of ChatGPT and other large language models that use massive amounts of energy. According to his research, worldwide AI energy demand is now set to surpass demand from bitcoin mining by the end of this year.


AI could account for nearly half of datacentre power usage 'by end of year'

The Guardian

Artificial intelligence systems could account for nearly half of datacentre power consumption by the end of this year, analysis has revealed. The estimates by Alex de Vries-Gao, the founder of the Digiconomist tech sustainability website, came as the International Energy Agency forecast that AI would require almost as much energy by the end of this decade as Japan uses today. De Vries-Gao's calculations, to be published in the sustainable energy journal Joule, are based on the power consumed by chips made by Nvidia and Advanced Micro Devices that are used to train and operate AI models. The paper also takes into account the energy consumption of chips used by other companies, such as Broadcom. The IEA estimates that all data centres, excluding mining for cryptocurrencies, consumed 415 terawatt hours (TWh) of electricity last year.
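A quick back-of-envelope pass over the quoted figures (a sketch only; the 20 percent AI share comes from the WIRED item above, and none of these numbers are taken directly from the Joule paper) shows why the "nearly half" claim is plausible once growth in AI demand is factored in:

```python
# Back-of-envelope check using only the figures quoted in these summaries.
total_twh = 415          # IEA: data-centre electricity last year, ex-crypto
ai_share_now = 0.20      # AI's current share of data-centre demand (up to 20%)

ai_twh_now = total_twh * ai_share_now      # ~83 TWh today
ai_twh_doubled = 2 * ai_twh_now            # ~166 TWh if AI demand doubles

avg_power_gw = total_twh * 1e12 / 8760 / 1e9   # TWh/year -> average GW draw
print(f"AI now: ~{ai_twh_now:.0f} TWh; doubled: ~{ai_twh_doubled:.0f} TWh "
      f"(~{ai_twh_doubled / total_twh:.0%} of last year's total)")
print(f"All data centres combined: ~{avg_power_gw:.0f} GW of average draw")
```

Doubling a 20 percent share puts AI around 40 percent of last year's total, which approaches half once overall data-centre demand keeps growing.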


Report: Creating a 5-second AI video is like running a microwave for an hour

Mashable

You've probably heard the statistic that every search on ChatGPT uses the equivalent of a bottle of water. And while that's technically true, it misses some of the nuance. The MIT Technology Review dropped a massive report that reveals how the artificial intelligence industry uses energy -- and exactly how much energy it costs to use a service like ChatGPT. The report determined that the energy cost of large language models like ChatGPT ranges anywhere from 114 joules per response to 6,706 joules per response -- that's the difference between running a microwave for one-tenth of a second and running it for eight seconds. The lower-energy models, according to the report, use less energy because they use fewer parameters, which also means the answers tend to be less accurate.
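The microwave comparison is easy to reproduce. Assuming a typical microwave of around 800 W (the article does not state the wattage behind its comparison), the quoted joule figures convert to runtimes that match the report's framing:

```python
# Convert the report's per-response energy figures into microwave runtime.
# The 800 W rating is an assumption for illustration.
MICROWAVE_WATTS = 800          # 1 watt = 1 joule per second

for label, joules in [("low-energy response", 114),
                      ("high-energy response", 6_706)]:
    seconds = joules / MICROWAVE_WATTS
    print(f"{label}: {joules} J ≈ {seconds:.2f} s of microwave time")

# The headline's video figure, run backwards: an hour of microwave use
hour_joules = MICROWAVE_WATTS * 3600
print(f"5-second AI video (per the headline): ≈ {hour_joules / 1e6:.1f} MJ")
```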


The Download: Google's AI mission, and America's reliance on natural gas

MIT Technology Review

If you want to know where AI is headed, this year's Google I/O has you covered. The company's annual showcase of next-gen products, which kicked off yesterday, has all of the pomp and pizzazz, the sizzle reels and celebrity walk-ons, that you'd expect from a multimillion-dollar marketing event. But it also shows us just how fast this still-experimental technology is being subsumed into a line-up designed to sell phones and subscription tiers. Never before have I seen this thing we call artificial intelligence appear so normal.

Last December, Meta announced plans to build a massive $10 billion data center for training its artificial intelligence models in rural northeast Louisiana.


Can nuclear power really fuel the rise of AI?

MIT Technology Review

This story is a part of MIT Technology Review's series "Power Hungry: AI and our energy future," on the energy demands and carbon costs of the artificial-intelligence revolution. These somewhat unlikely partnerships could be a win for both the nuclear power industry and large tech companies. Tech giants need guaranteed sources of energy, and many are looking for low-emissions ones to hit their climate goals. For nuclear plant operators and nuclear technology developers, the financial support of massive established customers could help keep old nuclear power plants open and push new technologies forward. "There [are] a lot of advantages to nuclear," says Michael Terrell, senior director of clean energy and carbon reduction at Google.


Everything you need to know about estimating AI's energy and emissions burden

MIT Technology Review

Despite the fact that billions of dollars are being poured into reshaping energy infrastructure around the needs of AI, no one has settled on a way to quantify AI's energy usage. Worse, companies are generally unwilling to disclose their own piece of the puzzle. There are also limitations to estimating the emissions associated with that energy demand, because the grid hosts a complicated, ever-changing mix of energy sources. So, that said, here are the many variables, assumptions, and caveats that we used to calculate the consequences of an AI query. Companies like OpenAI, dealing in "closed-source" models, generally offer access to their systems through an interface where you input a question and receive an answer.
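The kind of accounting the article walks through can be condensed into a few lines. The sketch below is a generic estimate in which every constant is an illustrative assumption, not MIT Technology Review's actual methodology or measured values:

```python
# Generic per-query energy/emissions estimate. Every constant below is an
# illustrative assumption, not a figure from the report.
gpu_power_w = 700        # e.g. one H100-class accelerator at full load
utilization = 0.8        # average fraction of peak power during inference
latency_s = 2.0          # wall-clock time to serve one response
n_gpus = 1               # accelerators involved in serving the query
pue = 1.2                # datacentre overhead (cooling, power delivery)
grid_gco2_per_kwh = 400  # carbon intensity of the local grid mix

energy_j = gpu_power_w * utilization * latency_s * n_gpus * pue
energy_kwh = energy_j / 3.6e6
emissions_g = energy_kwh * grid_gco2_per_kwh
print(f"~{energy_j:.0f} J = {energy_kwh * 1000:.3f} Wh per query, "
      f"~{emissions_g:.2f} g CO2e")
```

Under these made-up constants a single query lands in the hundreds-to-thousands of joules, the same ballpark as the per-response figures quoted in the Mashable item above; the grid-intensity term is exactly the "ever-changing mix of energy sources" caveat the article raises.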


AI's energy impact is still small--but how we handle it is huge

MIT Technology Review

Innovation in IT got us to this point. Graphics processing units (GPUs) that power the computing behind AI have fallen in cost by 99% since 2006. There was similar concern about the energy use of data centers in the early 2010s, with wild projections of growth in electricity demand. But gains in computing power and energy efficiency not only proved these projections wrong but enabled a 550% increase in global computing capability from 2010 to 2018 with only minimal increases in energy use. In the late 2010s, however, the trends that had saved us began to break.
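The efficiency claim is easy to make concrete. If global computing capability grew 550% while energy use grew only a few percent (the exact figure is not given here; 6% is assumed below for illustration), energy per unit of compute fell roughly sixfold:

```python
# Rough illustration of the 2010-2018 story quoted above. The 6% energy-growth
# figure is an assumption standing in for "only minimal increases".
compute_ratio = 1 + 5.50    # a 550% increase means 6.5x the 2010 level
energy_ratio = 1.06         # assumed small growth in data-centre energy use
print(f"Compute per joule improved ~{compute_ratio / energy_ratio:.1f}x "
      f"from 2010 to 2018 under these assumptions")
```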


We did the math on AI's energy footprint. Here's the story you haven't heard.

MIT Technology Review

AI's integration into our lives is the most significant shift in online life in more than a decade. Hundreds of millions of people now regularly turn to chatbots for help with homework, research, coding, or to create images and videos. Today, new analysis by MIT Technology Review provides an unprecedented and comprehensive look at how much energy the AI industry uses--down to a single query--to trace where its carbon footprint stands now, and where it's headed, as AI barrels towards billions of daily users. This story is a part of MIT Technology Review's series "Power Hungry: AI and our energy future," on the energy demands and carbon costs of the artificial-intelligence revolution. We spoke to two dozen experts measuring AI's energy demands, evaluated different AI models and prompts, pored over hundreds of pages of projections and reports, and questioned top AI model makers about their plans.