Energy use


'Walking sharks' lay eggs without breaking a sweat

Popular Science

Being pregnant and giving birth is hard work for any species--but epaulette sharks might disagree. These fish and a number of other species are known as "walking sharks" for their ability to traverse both the seafloor and land with their fins. Epaulette sharks' energy use didn't change during their reproductive cycle, according to a recently published study. "Reproduction is the ultimate investment--you are literally building new life from scratch," Jodie Rummer, a marine biologist at James Cook University and co-author of the study, said in a university statement.


Modular, On-Site Solutions with Lightweight Anomaly Detection for Sustainable Nutrient Management in Agriculture

Cohen, Abigail R., Sun, Yuming, Qin, Zhihao, Muriki, Harsh S., Xiao, Zihao, Lee, Yeonju, Housley, Matthew, Sharkey, Andrew F., Ferrarezi, Rhuanito S., Li, Jing, Gan, Lu, Chen, Yongsheng

arXiv.org Artificial Intelligence

Efficient nutrient management is critical for crop growth and sustainable resource consumption (e.g., nitrogen, energy). Current approaches require lengthy analyses, preventing real-time optimization; similarly, imaging facilitates rapid phenotyping but can be computationally intensive, preventing deployment under resource constraints. This study proposes a flexible, tiered pipeline for anomaly detection and status estimation (fresh weight, dry mass, and tissue nutrients), including a comprehensive energy analysis of approaches that span the efficiency-accuracy spectrum. Using a nutrient depletion experiment with three treatments (T1-100%, T2-50%, and T3-25% fertilizer strength) and multispectral imaging (MSI), we developed a hierarchical pipeline using an autoencoder (AE) for early warning. Further, we compared two status estimation modules of different complexity for more detailed analysis: vegetation index (VI) features with machine learning (Random Forest, RF) and raw whole-image deep learning (Vision Transformer, ViT). Results demonstrated high-efficiency anomaly detection (73% net detection of T3 samples 9 days after transplanting) at substantially lower energy cost than the embodied energy of wasted nitrogen. The status estimation modules show trade-offs, with ViT outperforming RF on phosphorus and calcium estimation (R2 0.61 vs. 0.58, 0.48 vs. 0.35) at higher energy cost. With our modular pipeline, this work opens opportunities for edge diagnostics and practical pathways toward agricultural sustainability.
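
The tiered idea is easy to sketch: a cheap autoencoder screens incoming samples by reconstruction error, and only flagged plants are passed to a heavier status estimator. The toy code below is a minimal illustration of that pattern, assuming scikit-learn is available; the feature dimensions, thresholds, and models are placeholders, not the authors' pipeline.

```python
# Minimal sketch of a tiered "early warning" pipeline. Names, dimensions,
# and thresholds are hypothetical, not the authors' implementation.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.ensemble import RandomForestRegressor

def fit_autoencoder(healthy_features):
    # A small MLP trained to reconstruct its own input stands in for the AE.
    ae = MLPRegressor(hidden_layer_sizes=(8, 3, 8), max_iter=2000, random_state=0)
    ae.fit(healthy_features, healthy_features)
    return ae

def reconstruction_error(ae, features):
    recon = ae.predict(features)
    return np.mean((features - recon) ** 2, axis=1)

# --- Stage 1: cheap anomaly screen on vegetation-index features ---
rng = np.random.default_rng(0)
healthy = rng.normal(0.7, 0.05, size=(200, 6))    # e.g., 6 VI features per plant
new_batch = rng.normal(0.55, 0.05, size=(20, 6))  # nutrient-depleted plants drift

ae = fit_autoencoder(healthy)
threshold = np.percentile(reconstruction_error(ae, healthy), 95)
flags = reconstruction_error(ae, new_batch) > threshold

# --- Stage 2: heavier status estimator runs only on flagged samples ---
rf = RandomForestRegressor(n_estimators=100, random_state=0)
rf.fit(healthy, rng.normal(10.0, 1.0, size=200))  # toy target, e.g., fresh weight (g)
if flags.any():
    estimates = rf.predict(new_batch[flags])
```

In the study's terms, stage 1 corresponds to the AE early-warning module and stage 2 to the RF (or ViT) status estimators that only need to run when something looks off.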


There's a simple way we could drastically cut AI energy use

New Scientist

Being more judicious about which AI models we use for tasks could potentially save 31.9 terawatt-hours of energy this year alone - equivalent to the output of five nuclear reactors. Tiago da Silva Barros at the University of Cote d'Azur in France and his colleagues looked at 14 different tasks that people use generative AI tools for, ranging from text generation to speech recognition and image classification. They then examined public leaderboards, including those hosted by the machine learning hub Hugging Face, to see how different models perform. The energy efficiency of the models during inference - when an AI model produces an answer - was measured with a tool called CarbonTracker, and the total energy use of each model was estimated by tracking user downloads. "Based on the size of the model, we estimated the energy consumption, and based on this, we can try to do our estimations," says da Silva Barros.
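
The selection logic the article describes can be illustrated in a few lines: for each task, pick the most energy-efficient model that still clears an accuracy floor, then weight per-inference energy by usage. The sketch below uses invented model names, accuracies, energy figures, and query volumes purely for illustration.

```python
# Illustrative sketch of judicious model selection (not the study's code):
# choose the cheapest model that meets an accuracy floor, then estimate
# fleet-wide savings. All numbers below are made up.
models = [
    # (name, accuracy, energy per inference in Wh)
    ("large-llm",  0.92, 2.0),
    ("medium-llm", 0.90, 0.6),
    ("small-llm",  0.84, 0.1),
]

def pick_model(candidates, min_accuracy):
    ok = [m for m in candidates if m[1] >= min_accuracy]
    return min(ok, key=lambda m: m[2]) if ok else None

default = models[0]                      # users default to the biggest model
chosen = pick_model(models, min_accuracy=0.89)

queries_per_year = 1e12                  # hypothetical task volume
saved_wh = (default[2] - chosen[2]) * queries_per_year
print(f"choose {chosen[0]}: ~{saved_wh / 1e9:.1f} GWh/year saved")
```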


Energy Use of AI Inference: Efficiency Pathways and Test-Time Compute

Oviedo, Felipe, Kazhamiaka, Fiodar, Choukse, Esha, Kim, Allen, Luers, Amy, Nakagawa, Melanie, Bianchini, Ricardo, Ferres, Juan M. Lavista

arXiv.org Artificial Intelligence

As AI inference scales to billions of queries and emerging reasoning and agentic workflows increase token demand, reliable estimates of per-query energy use are increasingly important for capacity planning, emissions accounting, and efficiency prioritization. Many public estimates are inconsistent and overstate energy use, because they extrapolate from limited benchmarks and fail to reflect efficiency gains achievable at scale. In this perspective, we introduce a bottom-up methodology to estimate the per-query energy of large-scale LLM systems based on token throughput. For models running on an H100 node under realistic workloads, GPU utilization and PUE constraints, we estimate a median energy per query of 0.34 Wh (IQR: 0.18-0.67) for frontier-scale models (>200 billion parameters). These results are consistent with measurements using production-scale configurations and show that non-production estimates and assumptions can overstate energy use by 4-20x. Extending to test-time scaling scenarios with 15x more tokens per typical query, the median energy rises 13x to 4.32 Wh, indicating that targeting efficiency in this regime will deliver the largest fleet-wide savings. We quantify achievable efficiency gains at the model, serving platform, and hardware levels, finding individual median reductions of 1.5-3.5x in energy per query, while combined advances can plausibly deliver 8-20x reductions. To illustrate the system-level impact, we estimate the baseline daily energy use of a deployment serving 1 billion queries to be 0.8 GWh/day. If 10% are long queries, demand could grow to 1.8 GWh/day. With targeted efficiency interventions, it falls to 0.9 GWh/day, similar to the energy footprint of web search at that scale. This echoes how data centers historically tempered energy growth through efficiency gains during the internet and cloud build-up.
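
A rough, back-of-the-envelope version of a token-throughput estimate can be written down directly, assuming energy per query scales as node power divided by token throughput, times tokens per query, times PUE. The numbers below are illustrative placeholders, not the paper's inputs or its exact methodology.

```python
# Back-of-the-envelope per-query energy, assuming
# energy/query ~= (node power / token throughput) * tokens/query * PUE.
# All figures are illustrative placeholders.
node_power_w = 10_000      # 8-GPU H100 node under load (illustrative)
throughput_tok_s = 6_000   # aggregate output tokens/s at realistic utilization
tokens_per_query = 250     # typical short query
pue = 1.2                  # data-center overhead

wh_per_query = node_power_w / throughput_tok_s * tokens_per_query * pue / 3600
print(f"{wh_per_query:.2f} Wh per query")      # ~0.14 Wh with these inputs

# Test-time scaling: ~15x more tokens per query raises energy roughly 15x
print(f"{wh_per_query * 15:.2f} Wh for a long reasoning query")
```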


Google's still not giving us the full picture on AI energy use

MIT Technology Review

"We're not comfortable revealing that for various reasons," Dean told me on our call. The total number is an abstract measure that changes over time, he says, adding that the company wants users to be thinking about the energy usage per prompt. But there are people out there all over the world interacting with this technology, not just me--and what we all add up to seems quite relevant. OpenAI does publicly share its total, sharing recently that it sees 2.5 billion queries to ChatGPT every day. So for the curious, we can use this as an example and take the company's self-reported average energy use per query (0.34 watt-hours) to get a rough idea of the total for all people prompting ChatGPT.


OpenAI will not disclose GPT-5's energy use. It could be higher than past models

The Guardian

In mid-2023, if a user asked OpenAI's ChatGPT for a recipe for artichoke pasta or instructions on how to make a ritual offering to the ancient Canaanite deity Moloch, its response might have taken – very roughly – 2 watt-hours, or about as much electricity as an incandescent bulb consumes in 2 minutes. OpenAI released a model on Thursday that will underpin the popular chatbot – GPT-5. Ask that version of the AI for an artichoke recipe, and the same amount of pasta-related text could take several times – even 20 times – that amount of energy, experts say. As it rolled out GPT-5, the company highlighted the model's breakthrough capabilities: its ability to create websites, answer PhD-level science questions, and reason through difficult problems. But experts who have spent the past years working to benchmark the energy and resource usage of AI models say those new powers come at a cost: a response from GPT-5 may take a significantly larger amount of energy than a response from previous versions of ChatGPT.
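
As a quick sanity check on the bulb comparison (assuming a 60-watt incandescent bulb, which the article does not specify), the numbers line up as follows.

```python
# Sanity check: 2 Wh is about what a 60 W incandescent bulb draws in
# 2 minutes, and a 20x response is roughly a 40-minute-bulb equivalent.
# The 60 W rating is our assumption.
bulb_w = 60
gpt4_era_wh = 2.0
print(f"{gpt4_era_wh / bulb_w * 60:.0f} min of bulb time per response")  # 2 min

gpt5_wh = gpt4_era_wh * 20
print(f"{gpt5_wh:.0f} Wh, or {gpt5_wh / bulb_w * 60:.0f} min of bulb time")
```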


The Carbon Cost of Conversation, Sustainability in the Age of Language Models

Amiri, Sayed Mahbub Hasan, Goswami, Prasun, Islam, Md. Mainul, Hossen, Mohammad Shakhawat, Amiri, Sayed Majhab Hasan, Akter, Naznin

arXiv.org Artificial Intelligence

Large language models (LLMs) like GPT-3 and BERT have revolutionized natural language processing (NLP), yet their environmental costs remain dangerously overlooked. This article critiques the sustainability of LLMs, quantifying their carbon footprint, water usage, and contribution to e-waste through case studies of models such as GPT-4 and energy-efficient alternatives like Mistral 7B. Training a single LLM can emit carbon dioxide equivalent to hundreds of cars driven annually, while data centre cooling exacerbates water scarcity in vulnerable regions. Systemic challenges (corporate greenwashing, redundant model development, and regulatory voids) perpetuate harm, disproportionately burdening marginalized communities in the Global South. However, pathways exist for sustainable NLP: technical innovations (e.g., model pruning, quantum computing), policy reforms (carbon taxes, mandatory emissions reporting), and cultural shifts prioritizing necessity over novelty. By analysing industry leaders (Google, Microsoft) and laggards (Amazon), this work underscores the urgency of ethical accountability and global cooperation. Without immediate action, AI's ecological toll risks outpacing its societal benefits. The article concludes with a call to align technological progress with planetary boundaries, advocating for equitable, transparent, and regenerative AI systems that prioritize both human and environmental well-being.


The Hidden Costs of AI: A Review of Energy, E-Waste, and Inequality in Model Development

Winsta, Jenis

arXiv.org Artificial Intelligence

Artificial intelligence (AI) has made remarkable progress in recent years, yet its rapid expansion brings overlooked environmental and ethical challenges. This review explores four critical areas where AI's impact extends beyond performance: energy consumption, electronic waste (e-waste), inequality in compute access, and the hidden energy burden of cybersecurity systems. Drawing from recent studies and institutional reports, the paper highlights systemic issues such as high emissions from model training, rising hardware turnover, global infrastructure disparities, and the energy demands of securing AI. By connecting these concerns, the review contributes to Responsible AI discourse by identifying key research gaps and advocating for sustainable, transparent, and equitable development practices. Ultimately, it argues that AI's progress must align with ethical responsibility and environmental stewardship to ensure a more inclusive and sustainable technological future.


The Download: reasons to be optimistic about AI's energy use, and Caiwei Chen's three things

MIT Technology Review

Two weeks ago, we launched Power Hungry, a new series shining a light on the energy demands and carbon costs of the artificial intelligence revolution. It raised some worrying issues, not least the incredible energy demands of AI video generation. But there are also reasons to be hopeful: innovations that could improve the efficiency of the software behind AI models, the computer chips those models run on, and the data centers where those chips hum around the clock. Here's what you need to know about how energy use, and therefore carbon emissions, could be cut across all three of those domains, plus an added argument for cautious optimism: the underlying business realities may ultimately bend toward more energy-efficient AI. In each issue of our print magazine, we ask a member of staff to tell us about three things they're loving at the moment. For our latest edition, which was all about creativity, we asked our China reporter Caiwei Chen to give us an insight into her life.


AI Is Eating Data Center Power Demand--and It's Only Getting Worse

WIRED

AI's energy use already represents as much as 20 percent of global data-center power demand, research published Thursday in the journal Joule shows. That demand from AI, the research states, could double by the end of this year, comprising nearly half of all total data-center electricity consumption worldwide, excluding the electricity used for bitcoin mining. The new research is published in a commentary by Alex de Vries-Gao, the founder of Digiconomist, a research company that evaluates the environmental impact of technology. De Vries-Gao started Digiconomist in the late 2010s to explore the impact that bitcoin mining, another extremely energy-intensive activity, would have on the environment. Looking at AI, he says, has grown more urgent over the past few years because of the widespread adoption of ChatGPT and other large language models that use massive amounts of energy. According to his research, worldwide AI energy demand is now set to surpass demand from bitcoin mining by the end of this year.