
Collaborating Authors

 Deck, Katherine


A Physics-Constrained Neural Differential Equation Framework for Data-Driven Snowpack Simulation

arXiv.org Artificial Intelligence

This paper presents a physics-constrained neural differential equation framework for parameterization, and employs it to model the time evolution of seasonal snow depth given hydrometeorological forcings. When trained on data from multiple SNOTEL sites, the parameterization predicts daily snow depth with under 9% median error and Nash-Sutcliffe efficiencies above 0.94 across a wide variety of snow climates. The parameterization also generalizes to new sites not seen during training, which is not often true for calibrated snow models. Requiring the parameterization to predict snow water equivalent in addition to snow depth only increases error to ~12%. The structure of the approach guarantees the satisfaction of physical constraints, enables these constraints during model training, and allows modeling at different temporal resolutions without additional retraining of the parameterization. These benefits make the approach promising for climate modeling, and it could extend to other dynamical systems with physical constraints.
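The core structural idea, a differential equation whose right-hand side is a neural network but whose form enforces the physics, can be sketched as follows. This is a minimal illustration, not the paper's architecture: the tiny random-weight MLP stands in for the trained parameterization, and the forcing variables (precipitation, temperature) are assumed inputs. Non-negative snow depth is guaranteed by construction, since the learned melt term is multiplied by an indicator on positive depth, and the timestep `dt` can be changed without retraining.

```python
import numpy as np

# Hypothetical sketch: dz/dt = accumulation - melt, with the melt term
# switched off when z = 0 so depth can never go negative.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 3)) * 0.1, np.zeros(8)
W2, b2 = rng.normal(size=(1, 8)) * 0.1, np.zeros(1)

def mlp(features):
    # stand-in for the trained neural parameterization
    h = np.tanh(W1 @ features + b1)
    return float(W2 @ h + b2)

def dzdt(z, forcing):
    precip, temp = forcing
    accumulation = max(precip, 0.0) if temp < 0.0 else 0.0
    # softplus keeps the learned melt rate non-negative
    melt_rate = np.log1p(np.exp(mlp(np.array([z, precip, temp]))))
    # the indicator (z > 0) enforces z >= 0 by construction
    return accumulation - melt_rate * (z > 0.0)

def integrate(z0, forcings, dt):
    # explicit Euler; dt can change without retraining the parameterization
    z = z0
    for f in forcings:
        z = max(z + dt * dzdt(z, f), 0.0)
    return z
```

With warm, dry forcing the depth stays pinned at zero, while cold, snowy forcing accumulates depth, regardless of what the untrained network predicts.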


Toward Routing River Water in Land Surface Models with Recurrent Neural Networks

arXiv.org Artificial Intelligence

Machine learning is playing an increasing role in hydrology, supplementing or replacing physics-based models. One notable example is the use of recurrent neural networks (RNNs) for forecasting streamflow given observed precipitation and geographic characteristics. Training of such a model over the continental United States has demonstrated that a single set of model parameters can be used across independent catchments, and that RNNs can outperform physics-based models. In this work, we take the next step and study the performance of RNNs for river routing in land surface models (LSMs). Instead of observed precipitation, the LSM-RNN uses instantaneous runoff calculated from physics-based models as an input. We train the model with data from river basins spanning the globe and test it in streamflow hindcasts. The model demonstrates skill at generalization across basins (predicting streamflow in unseen catchments) and across time (predicting streamflow during years not used in training). We compare the predictions from the LSM-RNN to an existing physics-based model calibrated with a similar dataset and find that the LSM-RNN outperforms the physics-based model. Our results give further evidence that RNNs are effective for global streamflow prediction from runoff inputs and motivate the development of complete routing models that can capture nested sub-basin connections.
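The routing setup described above can be sketched as a recurrent network mapping a runoff sequence plus static catchment attributes to a streamflow sequence. This is an illustrative toy, not the paper's model: the weights are random, the vanilla-RNN cell, hidden size, and two-attribute input are all assumptions; the key point is that one set of weights serves any basin, with catchment attributes concatenated to each timestep's runoff.

```python
import numpy as np

# Toy recurrent router: runoff sequence + static attributes -> streamflow.
rng = np.random.default_rng(1)
HIDDEN = 16
Wx = rng.normal(size=(HIDDEN, 1 + 2)) * 0.1   # runoff + 2 static attributes
Wh = rng.normal(size=(HIDDEN, HIDDEN)) * 0.1
Wo = rng.normal(size=(1, HIDDEN)) * 0.1

def route(runoff, attributes):
    """Return a streamflow prediction for each timestep."""
    h = np.zeros(HIDDEN)
    flows = []
    for r in runoff:
        # one shared set of weights works for any basin; the static
        # attributes tell the network which catchment it is routing
        x = np.concatenate([[r], attributes])
        h = np.tanh(Wx @ x + Wh @ h)
        flows.append(float(Wo @ h))
    return flows

q = route([0.1, 0.5, 0.2], np.array([0.3, -0.7]))
```

A trained version would replace the random weights with parameters fit across the global basin dataset; an LSTM or GRU cell is the more common practical choice.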


Response Theory via Generative Score Modeling

arXiv.org Artificial Intelligence

We introduce an approach for analyzing the responses of dynamical systems to external perturbations that combines score-based generative modeling with the Fluctuation-Dissipation Theorem (FDT). The methodology enables accurate estimation of system responses, especially for systems with non-Gaussian statistics, which are often encountered in dynamical systems far from equilibrium and which present limitations for conventional approximate methods. We numerically validate our approach using time-series data from a stochastic partial differential equation where the score function is available analytically. Furthermore, we demonstrate the improved accuracy of our methodology over conventional methods and its potential as a versatile tool for understanding complex dynamical systems. Applications span disciplines from climate science and finance to neuroscience.
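The FDT-with-score idea can be illustrated on a toy system where the score is known in closed form. The sketch below, under assumed parameter choices, uses a 1-D Ornstein-Uhlenbeck process, whose stationary density is Gaussian so the score is `-x / sigma^2`; the generalized FDT estimate of the impulse response is the lagged correlation between the state and the score at time zero, which for this process should decay like `exp(-gamma * t)`. In the paper's setting a learned score model would replace the analytic score.

```python
import numpy as np

# Score-based FDT response estimate on a 1-D Ornstein-Uhlenbeck process:
#   dx = -gamma * x dt + noise dW,  stationary p(x) ~ N(0, sigma2),
#   score(x) = -x / sigma2, and R(t) = -<x(t) * score(x(0))> ~ exp(-gamma t).
rng = np.random.default_rng(2)
gamma, noise, dt, steps, n_traj = 1.0, 1.0, 0.01, 200, 20000
sigma2 = noise**2 / (2 * gamma)                  # stationary variance

x0 = rng.normal(0.0, np.sqrt(sigma2), n_traj)    # samples from stationarity
x = x0.copy()
score0 = -x0 / sigma2                            # analytic score at t = 0
responses = []
for _ in range(steps):
    # Euler-Maruyama step of the OU dynamics
    x = x + dt * (-gamma * x) + np.sqrt(dt) * noise * rng.normal(size=n_traj)
    # FDT estimator: minus the correlation of the state with the initial score
    responses.append(-np.mean(x * score0))
```

`responses[0]` should be close to 1 and the curve should decay toward `exp(-gamma * t)`; with a generative score model the same estimator applies to non-Gaussian systems where no analytic score exists.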


Easing Color Shifts in Score-Based Diffusion Models

arXiv.org Artificial Intelligence

Generated images of score-based models can suffer from errors in their spatial means, an effect referred to as a color shift, which grows for larger images [14]. This paper investigates a previously-introduced approach to mitigate color shifts in score-based diffusion models. We quantify the performance of a nonlinear bypass connection in the score network, designed to process the spatial mean of the input and to predict the mean of the score function. We demonstrate how this method works using the FashionMNIST dataset [20] and snapshots from a high-resolution dynamical simulation of two-dimensional forced turbulent fluid flow [2], and contrast it with other solutions in terms of ease of implementation and performance. We show that this network architecture substantially improves the quality of the generated images, and that this improvement is approximately independent of the size of the generated images.
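The mean-bypass idea can be sketched in a few lines. This is a minimal illustration, not the paper's network: the input is decomposed into its spatial mean and a mean-zero remainder, the main branch (here a stand-in scaling) handles the remainder, and a small separate branch (here a single `tanh`, an assumed placeholder) predicts the mean of the score, added back as a uniform field.

```python
import numpy as np

# Sketch of a mean-bypass connection for a score network: the spatial
# mean is routed around the main branch and processed separately.
def mean_bypass_score(x, main_branch, mean_branch):
    m = x.mean()                       # spatial mean of the input image
    fluctuation = x - m                # mean-zero component
    out = main_branch(fluctuation)
    out = out - out.mean()             # keep the main branch mean-free
    return out + mean_branch(m)        # bypass predicts the score's mean

img = np.arange(16.0).reshape(4, 4)
score = mean_bypass_score(img, main_branch=lambda f: 0.5 * f,
                          mean_branch=lambda m: np.tanh(m))
```

By construction, the spatial mean of the output comes entirely from the bypass branch, so mean errors no longer scale with image size through the main network.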


Unpaired Downscaling of Fluid Flows with Diffusion Bridges

arXiv.org Artificial Intelligence

We present a method to downscale idealized geophysical fluid simulations using generative models based on diffusion maps. By analyzing the Fourier spectra of images drawn from different data distributions, we show how one can chain together two independent conditional diffusion models for use in domain translation. The resulting transformation is a diffusion bridge between a low resolution and a high resolution dataset and allows for new sample generation of high-resolution images given specific low resolution features. The ability to generate new samples allows for the computation of any statistic of interest, without any additional calibration or training. Our unsupervised setup is also designed to downscale images without access to paired training data; this flexibility allows for the combination of multiple source and target domains without additional training. We demonstrate that the method enhances resolution and corrects context-dependent biases in geophysical fluid simulations, including in extreme events. We anticipate that the same method can be used to downscale the output of climate simulations, including temperature and precipitation fields, without needing to train a new model for each application, providing significant computational cost savings.
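The bridging mechanism can be illustrated on a toy 1-D problem. In this sketch, assumed throughout and much simpler than the paper's setup, the two "domains" are Gaussians with analytic scores: a source-domain sample is noised forward to time `T` under a variance-exploding process, then the reverse SDE is integrated using the *target* domain's score, translating source samples into target-like samples. In practice each analytic score would be replaced by a trained conditional diffusion model.

```python
import numpy as np

# Toy diffusion bridge between two 1-D Gaussian "domains": noise source
# samples forward, then denoise with the target domain's score.
rng = np.random.default_rng(3)
src_mu, tgt_mu, sigma0 = -2.0, 3.0, 0.5
T, steps = 4.0, 400
dt = T / steps

def tgt_score(x, t):
    # analytic score of N(tgt_mu, sigma0^2 + t) under the VE forward process;
    # a trained target-domain diffusion model would replace this
    return -(x - tgt_mu) / (sigma0**2 + t)

x = src_mu + sigma0 * rng.normal(size=5000)     # source-domain samples
x = x + np.sqrt(T) * rng.normal(size=x.size)    # forward noising to time T
for i in range(steps):                          # reverse SDE, target score
    t = T - i * dt
    x = x + dt * tgt_score(x, t) + np.sqrt(dt) * rng.normal(size=x.size)
```

After the reverse pass the samples should sit near the target distribution rather than the source, which is the domain-translation behavior the bridge relies on; in the paper the shared noised state is what lets two independently trained models be chained.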