Coal
Google undercounts its carbon emissions, report finds
In 2021, Google set a lofty goal of achieving net-zero carbon emissions by 2030. Yet in the years since, the company has moved in the opposite direction as it invests in energy-intensive artificial intelligence. In its latest sustainability report, Google said its carbon emissions had increased 51% between 2019 and 2024. New research argues that even that enormous figure is an undercount, painting a bleaker picture of Google's sustainability record. A report authored by the non-profit advocacy group Kairos Fellowship found that, between 2019 and 2024, Google's carbon emissions actually went up by 65%.
How Much Energy Does AI Use? The People Who Know Aren't Saying
"People are often curious about how much energy a ChatGPT query uses," Sam Altman, the CEO of OpenAI, wrote in an aside in a long blog post last week. The average query, Altman wrote, uses 0.34 watt-hours of energy: "About what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes." For a company with 800 million weekly active users (and growing), the question of how much energy all those queries consume is becoming increasingly pressing. But experts say Altman's figure means little without far more public context from OpenAI about how it arrived at the calculation: what counts as an "average" query, whether it includes image generation, and whether it covers additional energy use such as training AI models and cooling OpenAI's servers. As a result, Sasha Luccioni, the climate lead at AI company Hugging Face, doesn't put much stock in Altman's number.
- Information Technology > Artificial Intelligence > Natural Language > Large Language Model (1.00)
- Information Technology > Artificial Intelligence > Natural Language > Chatbot (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning > Generative AI (0.75)
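The scale question raised above can be made concrete with a back-of-the-envelope calculation. This sketch takes Altman's 0.34 Wh figure and the 800 million weekly users at face value; the queries-per-user rate is an invented assumption, since OpenAI has not published one.

```python
# Back-of-the-envelope aggregate energy, assuming Altman's per-query figure.
# The per-user query rate below is a hypothetical assumption, not an OpenAI number.
WH_PER_QUERY = 0.34                  # Altman's claimed average, in watt-hours
WEEKLY_USERS = 800_000_000           # reported weekly active users
QUERIES_PER_USER_PER_WEEK = 10       # invented usage rate for illustration

weekly_wh = WH_PER_QUERY * WEEKLY_USERS * QUERIES_PER_USER_PER_WEEK
annual_mwh = weekly_wh * 52 / 1_000_000   # Wh -> MWh over a year

print(f"Weekly energy: {weekly_wh / 1e9:.2f} GWh")
print(f"Annual energy: {annual_mwh:,.0f} MWh")
```

Under these assumptions the queries alone come to a few gigawatt-hours per week; changing the assumed query rate scales the result linearly, which is exactly why experts want the underlying definitions published.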
The Download: power in Puerto Rico, and the pitfalls of AI agents
On the southeastern coast of Puerto Rico lies the island's only coal-fired power station, flanked by a mountain of toxic ash. The plant, owned by the utility giant AES, has long plagued this part of Puerto Rico with air and water pollution. Before the coal plant opened, Guayama had on average just over 103 cancer cases per year. In 2003, the year after the plant opened, the number of cancer cases in the municipality surged to 167. In 2022, the most recent year with available data, cases hit a new high of 209.
- Energy > Coal (0.84)
- Energy > Power Industry > Utilities (0.61)
Trump signs orders to allow coal-fired power plants to remain open
Donald Trump signed four executive orders on Tuesday aimed at reviving coal, the dirtiest fossil fuel, which has long been in decline and which contributes substantially to planet-heating greenhouse gas emissions and pollution. Environmentalists expressed dismay at the news, saying that Trump was stuck in the past and wanted to make utility customers "pay more for yesterday's energy". The US president is using emergency authority to allow some older coal-fired power plants scheduled for retirement to keep producing electricity. The move, announced at a White House event on Tuesday afternoon, was described by White House officials as a response to increased US power demand from growth in datacenters, artificial intelligence and electric cars. Trump, standing in front of a group of miners in hard hats, said he would sign an executive order "that slashes unnecessary regulations that targeted the beautiful, clean coal".
- Materials > Metals & Mining > Coal (1.00)
- Government > Regional Government > North America Government > United States Government (1.00)
- Energy > Power Industry > Utilities (1.00)
- Energy > Coal (1.00)
Trump vows to immediately ramp up U.S. production of 'beautiful, clean coal'
President Trump this week continued to make his environmental priorities clear by vowing to open up hundreds of coal power plants in the United States in an effort to advance competition against China. "After years of being held captive by Environmental Extremists, Lunatics, Radicals, and Thugs, allowing other Countries, in particular China, to gain tremendous Economic advantage over us by opening up hundreds of all Coal Fire Power Plants, I am authorizing my Administration to immediately begin producing Energy with BEAUTIFUL, CLEAN COAL," Trump wrote in a post on social media Monday. Though the post was not linked to any particular policy plans or documents, it arrives as the White House takes aim at various environmental agencies and clean-energy initiatives. In the last week alone, the administration has announced plans to significantly roll back regulations that govern coal production and to potentially lay off up to 65% of scientists and researchers at the Environmental Protection Agency, among other actions. Coal accounts for about 16% of the country's electricity generation, according to the U.S. Energy Information Administration, down from about 50% in 2000 as natural gas, nuclear and renewable energy have grown.
- Asia > China (0.51)
- North America > United States > California (0.06)
- Materials > Metals & Mining > Coal (1.00)
- Government > Regional Government > North America Government > United States Government (1.00)
- Energy > Power Industry (1.00)
- Energy > Coal (1.00)
Domain Consistent Industrial Decarbonisation of Global Coal Power Plants
Ashraf, Waqar Muhammad, Dua, Vivek, Debnath, Ramit
Machine learning and optimisation techniques (MLOPT) hold significant potential to accelerate the decarbonisation of industrial systems by enabling data-driven operational improvements. However, the practical application of MLOPT in industrial settings is often hindered by a lack of domain compliance and system-specific consistency, resulting in suboptimal solutions with limited real-world applicability. To address this challenge, we propose a novel human-in-the-loop (HITL) constraint-based optimisation framework that integrates domain expertise with data-driven methods, ensuring solutions are both technically sound and operationally feasible. We demonstrate the efficacy of this framework through a case study focused on enhancing the thermal efficiency and reducing the turbine heat rate of a 660 MW supercritical coal-fired power plant. By embedding domain knowledge as constraints within the optimisation process, our approach yields solutions that align with the plant's operational patterns and are seamlessly integrated into its control systems. Empirical validation confirms a mean improvement in thermal efficiency of 0.64% and a mean reduction in turbine heat rate of 93 kJ/kWh. Scaling our analysis to 59 global coal power plants with comparable capacity and fuel type, we estimate a cumulative lifetime reduction of 156.4 million tons of carbon emissions. These results underscore the transformative potential of our HITL-MLOPT framework in delivering domain-compliant, implementable solutions for industrial decarbonisation, offering a scalable pathway to mitigate the environmental impact of coal-based power generation worldwide.
- Asia (0.94)
- Europe > United Kingdom > England > Cambridgeshire > Cambridge (0.14)
- Energy > Power Industry > Utilities (1.00)
- Energy > Coal (1.00)
- Materials > Metals & Mining > Coal (0.88)
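The core idea of the abstract above, embedding expert knowledge as hard constraints so the optimiser only proposes operationally feasible setpoints, can be sketched generically. This is not the authors' code: the surrogate heat-rate model, the two setpoint variables, and the constraint bounds are all invented for illustration.

```python
import random

def heat_rate(x):
    """Hypothetical surrogate model: turbine heat rate (kJ/kWh) vs. setpoints."""
    main_steam_temp, feedwater_flow = x
    return 7800 + 0.02 * (main_steam_temp - 566) ** 2 \
                + 0.01 * (feedwater_flow - 1800) ** 2

def is_feasible(x):
    """Domain (human-in-the-loop) constraints: ranges operators deem safe."""
    temp, flow = x
    return 540 <= temp <= 571 and 1600 <= flow <= 1950

def optimise(n_samples=10_000, seed=0):
    """Random search that rejects any candidate violating domain constraints."""
    rng = random.Random(seed)
    best, best_val = None, float("inf")
    for _ in range(n_samples):
        cand = (rng.uniform(500, 600), rng.uniform(1500, 2100))
        if not is_feasible(cand):      # domain compliance enforced up front
            continue
        val = heat_rate(cand)
        if val < best_val:
            best, best_val = cand, val
    return best

best = optimise()
```

The design point is that feasibility is checked before the objective, so every solution handed back to plant operators already respects their constraints, which is the property the HITL framing is after.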
CarbonChat: Large Language Model-Based Corporate Carbon Emission Analysis and Climate Knowledge Q&A System
Cao, Zhixuan, Han, Ming, Wang, Jingtao, Jia, Meng
As the impact of global climate change intensifies, corporate carbon emissions have become a focal point of global attention. In response to issues such as the lag in climate change knowledge updates within large language models, the lack of specialization and accuracy in traditional augmented generation architectures for complex problems, and the high cost and time consumption of sustainability report analysis, this paper proposes CarbonChat, a Large Language Model-based corporate carbon emission analysis and climate knowledge Q&A system, aimed at achieving precise carbon emission analysis and policy understanding. First, a diversified index module construction method is proposed to handle the segmentation of rule-based and long-text documents, as well as the extraction of structured data, thereby optimizing the parsing of key information. Second, an enhanced self-prompt retrieval-augmented generation architecture is designed, integrating intent recognition, structured reasoning chains, hybrid retrieval, and Text2SQL, improving the efficiency of semantic understanding and query conversion. Next, based on the greenhouse gas accounting framework, 14 dimensions are established for carbon emission analysis, enabling report summarization, relevance evaluation, and customized responses. Finally, through a multi-layer chunking mechanism, timestamps, and hallucination detection features, the accuracy and verifiability of the analysis results are ensured, reducing hallucination rates and enhancing the precision of the responses.
- Information Technology > Artificial Intelligence > Natural Language > Question Answering (1.00)
- Information Technology > Artificial Intelligence > Natural Language > Large Language Model (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning (1.00)
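The retrieval step at the heart of the architecture above can be illustrated in miniature: score timestamped report chunks against a query and return the best match. This is a toy sketch, not CarbonChat's implementation; the chunks, the query, and the bag-of-words scoring are invented stand-ins for its hybrid retrieval.

```python
from collections import Counter

# Invented example chunks from a hypothetical sustainability report,
# each carrying a timestamp as the abstract describes.
chunks = [
    {"text": "scope 1 emissions rose due to onsite fuel combustion", "ts": "2024-03"},
    {"text": "scope 2 emissions fell after purchasing renewable power", "ts": "2024-03"},
    {"text": "board approved a net-zero target for 2040", "ts": "2023-11"},
]

def score(query, chunk):
    """Bag-of-words overlap between query and chunk (a crude relevance proxy)."""
    q = Counter(query.lower().split())
    c = Counter(chunk["text"].lower().split())
    return sum((q & c).values())

def retrieve(query, k=1):
    """Return the k highest-scoring chunks for the query."""
    return sorted(chunks, key=lambda ch: score(query, ch), reverse=True)[:k]

top = retrieve("scope 2 emissions renewable")
```

A production system would swap the overlap score for dense embeddings plus keyword search (the "hybrid retrieval" the abstract names), but the retrieve-then-answer shape is the same.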
Engineering Carbon Credits Towards A Responsible FinTech Era: The Practices, Implications, and Future
Zeng, Qingwen, Xu, Hanlin, Xu, Nanjun, Salim, Flora, Gao, Junbin, Chen, Huaming
Carbon emissions significantly contribute to climate change, and carbon credits have emerged as a key tool for mitigating environmental damage and helping organizations manage their carbon footprint. Despite their growing importance across sectors, fully leveraging carbon credits remains challenging. This study explores engineering practices and fintech solutions to enhance carbon emission management. We first review the negative impacts of carbon emission non-disclosure, revealing its adverse effects on financial stability and market value. Organizations are encouraged to actively manage emissions and disclose relevant data to mitigate risks. Next, we analyze factors influencing carbon prices and review advanced prediction algorithms that optimize carbon credit purchasing strategies, reducing costs and improving efficiency. Additionally, we examine corporate carbon emission prediction models, which offer accurate performance assessments and aid in planning future carbon credit needs. By integrating carbon price and emission predictions, we propose research directions, including corporate carbon management cost forecasting. This study provides a foundation for future quantitative research on the financial and market impacts of carbon management practices and is the first systematic review focusing on computing solutions and engineering practices for carbon credits.
- North America (0.93)
- Asia > China (0.68)
- Oceania > Australia (0.46)
- Europe > United Kingdom (0.46)
- Research Report > New Finding (1.00)
- Research Report > Experimental Study (1.00)
- Overview (1.00)
- Law > Environmental Law (1.00)
- Government (1.00)
- Energy > Oil & Gas (1.00)
- (2 more...)
CEGI: Measuring the trade-off between efficiency and carbon emissions for SLMs and VLMs
Kumar, Abhas, Pathak, Kapil, Kavuru, Rajesh, Srinivasan, Prabhakar
This paper analyzes the performance of Small Language Models (SLMs) and Vision Language Models (VLMs) and evaluates the trade-off between model performance and carbon emissions across 4 essential tasks: Image Captioning, Visual Question Answering (VQA), Dialogue Summarization and Text-to-SQL conversion. Various SLMs and VLMs belonging to the Qwen and LLaMA architecture families are chosen, and variants based on model size in terms of the number of parameters, quantization level and fine-tuning parameters are evaluated. Each model variant's performance and carbon emissions are calculated. To quantify the trade-off between model performance and carbon emissions, we introduce a novel metric called CEGI (Carbon Efficient Gain Index). This metric represents the carbon emission per unit percentage gain per million trainable parameters. It provides a normalized measure to compare models' efficiency in terms of performance improvement relative to their environmental cost. The experiments demonstrate that fine-tuning SLMs and VLMs can achieve performance levels comparable to Large Language Models (LLMs) while producing significantly less carbon emissions. Our findings suggest that the marginal gains in accuracy from larger models do not justify the substantial increase in carbon emissions. Leveraging lower-bit quantization levels, the proposed metric further enhances energy efficiency without compromising performance. This study highlights the balance between high performance and environmental sustainability, and offers a valuable metric for selecting models suitable for environmentally-friendly AI development.
- Asia (0.68)
- Europe (0.46)
- North America > United States > Minnesota (0.28)
Harnessing Your DRAM and SSD for Sustainable and Accessible LLM Inference with Mixed-Precision and Multi-level Caching
Peng, Jie, Cao, Zhang, Qu, Huaizhi, Zhang, Zhengyu, Guo, Chang, Zhang, Yanyong, Cao, Zhichao, Chen, Tianlong
Although Large Language Models (LLMs) have demonstrated remarkable capabilities, their massive parameter counts and the associated extensive computation make LLM deployment the dominant source of carbon emissions from today's AI applications. Compared to modern GPUs like the H100, it would be significantly more carbon-sustainable if we could leverage old-fashioned GPUs such as the M40 (as shown in Figure 1, an M40 has only one third the carbon emissions of an H100) for LLM serving. However, the limited High Bandwidth Memory (HBM) available on such GPUs often cannot support loading LLMs, given their gigantic model sizes and intermediate activation data, making serving challenging. For instance, a LLaMA2 model with 70B parameters typically requires 128GB for inference, which substantially exceeds the 24GB of HBM in a 3090 GPU and remains infeasible even counting an additional 64GB of DRAM. To address this challenge, this paper proposes a mixed-precision, model-modularization algorithm with multi-level caching (M2Cache) to enable LLM inference on outdated hardware with resource constraints (precision here denotes numerical precision such as FP16, INT8, or INT4). Specifically, M2Cache first modularizes the neurons in an LLM and creates their importance ranking. Then, it adopts a dynamic sparse mixed-precision quantization mechanism in weight space to reduce computational demands and communication overhead at each decoding step, collectively lowering the operational carbon emissions associated with LLM inference. Moreover, M2Cache introduces a three-level cache management system spanning HBM, DRAM, and SSDs that complements the dynamic sparse mixed-precision inference. To enhance communication efficiency, M2Cache maintains a neuron-level mixed-precision LRU cache in HBM, a larger layer-aware cache in DRAM, and the full model on SSD.
- North America > United States > New York > New York County > New York City (0.04)
- North America > United States > California > Santa Clara County > Santa Clara (0.04)
- North America > United States > North Carolina (0.04)
- (7 more...)
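The three-tier HBM/DRAM/SSD idea in the abstract above can be sketched with ordinary LRU caches. This is a toy illustration, not M2Cache itself: the tier capacities, the string keys standing in for neuron groups, and the eviction policy are all simplifying assumptions.

```python
from collections import OrderedDict

class TieredCache:
    """Toy three-tier cache: small fast "HBM" LRU in front of a larger
    "DRAM" LRU, with "SSD" holding the full model as the backing store."""

    def __init__(self, hbm_cap=2, dram_cap=4):
        self.hbm = OrderedDict()    # fastest, smallest tier
        self.dram = OrderedDict()   # middle tier
        self.ssd = {}               # full model always resides here
        self.hbm_cap, self.dram_cap = hbm_cap, dram_cap

    def put_ssd(self, key, weights):
        self.ssd[key] = weights

    def get(self, key):
        for tier in (self.hbm, self.dram):
            if key in tier:
                weights = tier[key]
                self._promote(key, weights)   # hot entry moves up to HBM
                return weights
        weights = self.ssd[key]               # miss: fetch from backing store
        self._promote(key, weights)
        return weights

    def _promote(self, key, weights):
        self.dram.pop(key, None)              # avoid a duplicate copy in DRAM
        self.hbm[key] = weights
        self.hbm.move_to_end(key)             # LRU refresh
        if len(self.hbm) > self.hbm_cap:
            evicted, w = self.hbm.popitem(last=False)   # demote LRU entry
            self.dram[evicted] = w
            if len(self.dram) > self.dram_cap:
                self.dram.popitem(last=False)  # drop to SSD-only residency
```

Evictions cascade downward, so recently used entries sit in the fast tier while the SSD copy guarantees nothing is ever lost, which is the residency pattern the abstract describes.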