climate modeling
Learning to generate physical ocean states: Towards hybrid climate modeling
Meunier, Etienne, Kamm, David, Gachon, Guillaume, Lguensat, Redouane, Deshayes, Julie
Ocean General Circulation Models require extensive computational resources to reach equilibrium states, while deep learning emulators, despite offering fast predictions, lack the physical interpretability and long-term stability necessary for climate scientists to understand climate sensitivity (to greenhouse gas emissions) and mechanisms of abrupt variability such as tipping points. We propose to take the best of both worlds by leveraging deep generative models to produce physically consistent oceanic states that can serve as initial conditions for climate projections. We assess the viability of this hybrid approach through both physical metrics and numerical experiments, and highlight the benefits of enforcing physical constraints during generation. Although we train here on ocean variables from idealized numerical simulations, we claim that this hybrid approach, combining the computational efficiency of deep learning with the physical accuracy of numerical models, can effectively reduce the computational burden of running climate models to equilibrium, and reduce uncertainties in climate projections by minimizing drifts in baseline simulations.
- Europe > France > Île-de-France > Paris > Paris (0.05)
- Southern Ocean (0.04)
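The abstract above highlights "enforcing physical constraints during generation" without specifying which constraints are used. A common hard-constraint trick in this setting is to project a generated field back onto a conservation law after sampling. The sketch below is a hypothetical minimal example (not the authors' method): it shifts a generated gridded field so that its basin mean matches a conserved target value.

```python
# Hypothetical sketch of a hard physical constraint applied after generation:
# project a generated 2-D gridded field (list of rows) so its area mean
# matches a conserved target (e.g., basin-mean temperature). The actual
# constraints used in the paper are not specified in this abstract.

def project_to_conserved_mean(field, target_mean):
    """Shift every cell by a constant so the field mean equals target_mean."""
    n = sum(len(row) for row in field)
    current_mean = sum(sum(row) for row in field) / n
    shift = target_mean - current_mean
    return [[v + shift for v in row] for row in field]

generated = [[10.0, 12.0], [14.0, 16.0]]    # made-up generator output
constrained = project_to_conserved_mean(generated, target_mean=12.0)
print(sum(sum(r) for r in constrained) / 4)  # 12.0
```

A uniform shift is the simplest projection; in practice one would project in a norm that respects the grid's cell areas, but the idea of correcting samples onto the constraint set is the same.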
Efficient Localized Adaptation of Neural Weather Forecasting: A Case Study in the MENA Region
Munir, Muhammad Akhtar, Khan, Fahad Shahbaz, Khan, Salman
Accurate weather and climate modeling is critical for both scientific advancement and safeguarding communities against environmental risks. Traditional approaches rely heavily on Numerical Weather Prediction (NWP) models, which simulate energy and matter flow across Earth's systems. However, heavy computational requirements and low efficiency restrict the suitability of NWP, leading to a pressing need for enhanced modeling techniques. Neural network-based models have emerged as promising alternatives, leveraging data-driven approaches to forecast atmospheric variables. In this work, we focus on limited-area modeling and train our model specifically for localized region-level downstream tasks. As a case study, we consider the MENA region due to its unique climatic challenges, where accurate localized weather forecasting is crucial for managing water resources and agriculture and for mitigating the impacts of extreme weather events. This targeted approach allows us to tailor the model's capabilities to the unique conditions of the region of interest. Our study aims to validate the effectiveness of integrating parameter-efficient fine-tuning (PEFT) methodologies, specifically Low-Rank Adaptation (LoRA) and its variants, to enhance forecast accuracy, as well as training speed, computational resource utilization, and memory efficiency in weather and climate modeling for specific regions.
- Africa > North Africa (0.05)
- Europe > Middle East (0.05)
- Asia > Middle East > Kuwait (0.05)
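The parameter-efficiency argument behind LoRA, mentioned in the abstract above, is easy to make concrete: instead of updating a full weight matrix W, LoRA freezes W and learns a low-rank update B @ A. The sketch below counts trainable parameters for one weight matrix; the dimensions and rank are hypothetical, not taken from the paper.

```python
# Illustrative parameter count: full fine-tuning vs. LoRA for a single
# d_out x d_in weight matrix. LoRA trains only the factors
# A (rank x d_in) and B (d_out x rank) of a low-rank update B @ A.
# Dimensions and rank below are hypothetical examples.

def full_params(d_out, d_in):
    return d_out * d_in              # every entry of W is trainable

def lora_params(d_out, d_in, rank):
    return rank * (d_in + d_out)     # entries of A plus entries of B

d = 1024
print(full_params(d, d))             # 1048576
print(lora_params(d, d, rank=8))     # 16384, about 1.6% of full
```

This roughly 64x reduction per adapted matrix is what makes region-level fine-tuning of a large pretrained forecast model cheap in both memory and compute.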
AQ-PINNs: Attention-Enhanced Quantum Physics-Informed Neural Networks for Carbon-Efficient Climate Modeling
Dutta, Siddhant, Innan, Nouhaila, Yahia, Sadok Ben, Shafique, Muhammad
The growing computational demands of artificial intelligence (AI) in addressing climate change raise significant concerns about inefficiencies and environmental impact, as highlighted by the Jevons paradox. We propose an attention-enhanced quantum physics-informed neural networks model (AQ-PINNs) to tackle these challenges. This approach integrates quantum computing techniques into physics-informed neural networks (PINNs) for climate modeling, aiming to enhance predictive accuracy in fluid dynamics governed by the Navier-Stokes equations while reducing the computational burden and carbon footprint. By harnessing variational quantum multi-head self-attention mechanisms, our AQ-PINNs achieve a 51.51% reduction in model parameters compared to classical multi-head self-attention methods while maintaining comparable convergence and loss. It also employs quantum tensor networks to enhance representational capacity, which can lead to more efficient gradient computations and reduced susceptibility to barren plateaus. Our AQ-PINNs represent a crucial step towards more sustainable and effective climate modeling solutions.
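The core PINN structure referenced above, a loss combining a data (or boundary) term with a physics residual, can be shown on a toy problem. The sketch below is a minimal classical illustration only: it uses a hypothetical 1-D ODE du/dx + u = 0 with u(0) = 1 and a quadratic ansatz, not the Navier-Stokes equations or the quantum attention model of AQ-PINNs.

```python
# Minimal classical PINN-style loss for the toy ODE du/dx + u = 0, u(0) = 1,
# modelling u(x) = a + b*x + c*x**2. Illustrates only the
# "physics residual + boundary/data term" structure of PINNs; the ODE
# and parameterization are hypothetical, not from the paper.

def u(theta, x):
    a, b, c = theta
    return a + b * x + c * x * x

def du_dx(theta, x):
    _, b, c = theta
    return b + 2 * c * x              # analytic derivative of the ansatz

def pinn_loss(theta, collocation_xs):
    # physics term: mean squared ODE residual at collocation points
    physics = sum((du_dx(theta, x) + u(theta, x)) ** 2
                  for x in collocation_xs) / len(collocation_xs)
    # boundary/data term: u(0) should equal 1
    boundary = (u(theta, 0.0) - 1.0) ** 2
    return physics + boundary

# The truncated Taylor series of exp(-x), theta = (1, -1, 0.5), should
# score far better than an unphysical guess.
good = pinn_loss((1.0, -1.0, 0.5), [0.0, 0.5, 1.0])
bad = pinn_loss((0.0, 1.0, 1.0), [0.0, 0.5, 1.0])
print(good < bad)   # True
```

In a real PINN the ansatz is a neural network and the residual derivatives come from automatic differentiation; AQ-PINNs replace the classical attention layers inside that network with variational quantum circuits.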
Automating the math for decision-making under uncertainty
One reason deep learning exploded over the last decade was the availability of programming languages that could automate the math -- college-level calculus -- that is needed to train each new model. Neural networks are trained by tuning their parameters to try to maximize a score that can be rapidly calculated for training data. The equations used to adjust the parameters in each tuning step used to be derived painstakingly by hand. Deep learning platforms use a method called automatic differentiation to calculate the adjustments automatically. This allowed researchers to rapidly explore a huge space of models, and find the ones that really worked, without needing to know the underlying math.
- North America > United States > Massachusetts > Middlesex County > Cambridge (0.40)
- North America > United States > Illinois (0.05)
- Europe > United Kingdom > England > Oxfordshire > Oxford (0.05)
- Social Sector (0.52)
- Government (0.33)
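The automatic differentiation described in the article above can be demonstrated in a few lines with forward-mode dual numbers: each value carries its derivative alongside it, so the parameter adjustment falls out of ordinary arithmetic rather than a hand-derived formula. This is a toy sketch of the idea, not any particular framework's implementation.

```python
# Forward-mode automatic differentiation with dual numbers: every value
# carries (value, derivative), and the chain rule is applied by the
# overloaded arithmetic operators. Toy illustration of what deep learning
# platforms automate; model and data below are made up.

class Dual:
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv

    def _wrap(self, other):
        return other if isinstance(other, Dual) else Dual(other)

    def __add__(self, other):
        other = self._wrap(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    def __sub__(self, other):
        other = self._wrap(other)
        return Dual(self.value - other.value, self.deriv - other.deriv)

    def __mul__(self, other):
        other = self._wrap(other)
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

def loss(w):
    # squared error of the model y = w * x on one training pair (3, 6)
    x, y = 3.0, 6.0
    err = w * x - y
    return err * err

# Seed deriv=1 on the parameter; the loss's deriv slot is then dLoss/dw.
w = Dual(1.0, 1.0)
grad = loss(w).deriv             # d/dw (3w - 6)^2 = 6*(3w - 6) = -18 at w=1
w_new = w.value - 0.01 * grad    # one gradient-descent step
print(grad)                      # -18.0
```

Production frameworks use reverse-mode differentiation (backpropagation) instead, which computes gradients for millions of parameters in one backward pass, but the principle of deriving the adjustments mechanically is the same.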
GitHub - blutjens/awesome-MIT-ai-for-climate-change: 🌍 A curated list of MIT profs that tackle climate change with machine learning for applying students, undergraduates, or others
Finding professors in machine learning and climate change is difficult, because they are spread across various departments and research a wide breadth of topics. Whether you're applying for graduate school, looking for collaborators, or seeking project inspiration, this list is intended to get you started by finding the right people. This is a safe, open, and inclusive community. The list is almost surely incomplete, so please add your favorite professors by commenting in an issue or creating a pull request. Students in CCML include Vincent Meijer.
Nvidia Speeds AI, Climate Modeling - AI Summary
It's been years since developers found that Nvidia's main product, the GPU, was useful not just for rendering video games but also for high-performance computing of the kind used in 3D modeling, weather forecasting, or the training of AI models--and it's on enterprise applications such as those that CEO Jensen Huang will focus his attention at the company's GTC 2022 conference this week. For some applications, a simple database may suffice to record a product's service history--when it was made, who it shipped to, what modifications have been applied--while others require a full-on 3D model incorporating real-time sensor data that can be used, for example, to provide advanced warning of component failure or of rain. Two groups of researchers are already using Nvidia's Modulus AI framework for developing physics machine learning models and its Omniverse 3D virtual world simulation platform to forecast the weather with greater confidence and speed, and to optimize the design of wind farms. To help other enterprises build and maintain their own digital twins, later this year Nvidia will offer OVX computing systems running its Omniverse software on racks loaded with its GPUs, storage, and high-speed switch fabric. The option to securely process such data on a GPU, even in a public cloud or a colocation facility, could enable enterprises to speed up the development and use of machine learning models without scaling up capital spending.
Nvidia speeds AI, climate modeling
It's been years since developers found that Nvidia's main product, the GPU, was useful not just for rendering video games but also for high-performance computing of the kind used in 3D modeling, weather forecasting, or the training of AI models--and it's on enterprise applications such as those that CEO Jensen Huang will focus his attention at the company's GTC 2022 conference this week. Nvidia is hoping to make it easier for CIOs building digital twins and machine learning models to secure enterprise computing, and even to speed the adoption of quantum computing with a range of new hardware and software. Digital twins, numerical models that reflect changes in real-world objects useful in design, manufacturing, and service creation, vary in their level of detail. For some applications, a simple database may suffice to record a product's service history--when it was made, who it shipped to, what modifications have been applied--while others require a full-on 3D model incorporating real-time sensor data that can be used, for example, to provide advanced warning of component failure or of rain. It's at the high end of that range that Nvidia plays.
- Information Technology > Hardware (1.00)
- Energy (0.72)
Opportunities and limits of AI in climate modeling
Earth system models are the most important tools for quantitatively describing the physical state of Earth, and--for example, in the context of climate models--predicting how it might change in the future under the influence of human activities. How the increasingly used methods of artificial intelligence (AI) can help to improve these forecasts and where the limits of the two approaches lie has now been investigated by an international team led by Christopher Irrgang from the German Research Centre for Geosciences Potsdam (GFZ) in a Perspectives article for the journal Nature Machine Intelligence. One key proposal: To merge both approaches into a self-learning "neural Earth system modeling." The development of Earth is a complex interplay of many factors, including the land surface with flora and fauna, the oceans with their ecosystem, the polar regions, the atmosphere, the carbon cycle and other biogeochemical cycles, and radiation processes. Researchers therefore speak of the Earth system.
Why are Climate models written in programming languages from 1950?
Recently, a friend sent me a Wired article entitled "The Power and Paradox of Bad Software". The short piece, written by Paul Ford, discusses the idea that the software industry might be too obsessed with creating better and better tools for itself while neglecting mundane software such as resource scheduling systems or online library catalogs. The author claims that the winners of the bad software lottery are the computational scientists who develop our climate models. Since climate change might be one of the biggest problems for the next generation, some might find it a bit worrying if one of our best tools for examining climate change was written with "bad software". In this post, I discuss the question of whether climate scientists lost the "bad software sweepstakes". I'll cover the basics of climate models, what software is commonly used in climate modeling and why, and what alternative software exists. Best I can tell, the bad software sweepstakes has been won (or lost) by climate change folks.