There's a simple way we could drastically cut AI energy use

New Scientist 

Being more judicious about which AI models we use for which tasks could save an estimated 31.9 terawatt-hours of energy this year alone, equivalent to the output of five nuclear reactors.

Tiago da Silva Barros at the Université Côte d'Azur in France and his colleagues looked at 14 tasks that people use generative AI tools for, ranging from text generation to speech recognition and image classification. They then examined public leaderboards, including those hosted by the machine-learning hub Hugging Face, to see how different models perform on each task. The energy efficiency of each model during inference, when an AI model produces an answer, was measured with a tool called CarbonTracker, and the model's total energy use was estimated by tracking user downloads. "Based on the size of the model, we estimated the energy consumption, and based on this, we can try to do our estimations," says da Silva Barros.
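The estimation logic described, scaling per-inference energy by usage and comparing models on the same task, can be sketched as follows. This is a minimal illustration of that arithmetic; all model names, scores and energy figures below are hypothetical placeholders, not the study's data.

```python
# Hypothetical sketch: total inference energy = per-query energy * query volume,
# and savings come from routing a task to a more efficient model that still
# performs acceptably. All numbers are invented for illustration.

def total_energy_kwh(kwh_per_inference: float, num_inferences: int) -> float:
    """Total energy consumed serving a task with one model."""
    return kwh_per_inference * num_inferences

# Two hypothetical text-generation models: task score and energy per query.
large_model = {"score": 0.92, "kwh_per_inference": 0.004}
small_model = {"score": 0.89, "kwh_per_inference": 0.001}

queries = 1_000_000  # hypothetical annual query volume for one task

cost_large = total_energy_kwh(large_model["kwh_per_inference"], queries)
cost_small = total_energy_kwh(small_model["kwh_per_inference"], queries)

print(f"Large model:  {cost_large:.0f} kWh")                      # 4000 kWh
print(f"Small model:  {cost_small:.0f} kWh")                      # 1000 kWh
print(f"Switching saves {cost_large - cost_small:.0f} kWh")       # 3000 kWh
```

In this toy case, accepting a slightly lower score cuts the task's energy use by 75 per cent, which is the kind of trade-off the researchers' task-level comparison makes visible.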
