Becker, Charlie
Uncertainty Quantification of Wind Gust Predictions in the Northeast US: An Evidential Neural Network and Explainable Artificial Intelligence Approach
Jahan, Israt, Schreck, John S., Gagne, David John, Becker, Charlie, Astitha, Marina
Machine learning has shown promise in reducing bias in numerical weather model predictions of wind gusts. Yet such models underperform in predicting high gusts, even with additional observations, because of the right-skewed distribution of gusts. Uncertainty quantification (UQ) addresses this by identifying when predictions are reliable and when they require cautious interpretation. Using data from 61 extratropical storms in the Northeastern USA, we introduce the evidential neural network (ENN) as a novel approach for UQ in gust predictions, leveraging atmospheric variables from the Weather Research and Forecasting (WRF) model as features and gust observations as targets. Explainable artificial intelligence (XAI) techniques demonstrated that key predictive features also contributed to higher uncertainty. Estimated uncertainty correlated with storm intensity and spatial gust gradients. The ENN allowed the construction of gust prediction intervals without requiring an ensemble. From an operational perspective, providing gust forecasts with quantified uncertainty enhances stakeholders' confidence in risk assessment and response planning for extreme gust events.
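For readers who want to see how a single evidential model can yield a prediction interval without an ensemble, the sketch below is a minimal illustration, not the authors' code. It assumes the Normal-Inverse-Gamma (NIG) parameterization of deep evidential regression (Amini et al. 2020), in which the network outputs four parameters (gamma, nu, alpha, beta) per sample; the `evidential_interval` helper and all numeric values are hypothetical.

```python
# Minimal sketch (assumed NIG parameterization; not the authors' code):
# derive a gust prediction interval from one evidential forward pass.
import numpy as np

def evidential_interval(gamma, nu, alpha, beta, z=1.96):
    """Approximate 95% prediction interval from NIG parameters.

    gamma     : predictive mean (the gust prediction itself)
    aleatoric : E[sigma^2] = beta / (alpha - 1)
    epistemic : Var[mu]    = beta / (nu * (alpha - 1))
    """
    aleatoric = beta / (alpha - 1.0)
    epistemic = beta / (nu * (alpha - 1.0))
    sigma = np.sqrt(aleatoric + epistemic)  # total predictive spread
    return gamma - z * sigma, gamma + z * sigma

# Hypothetical NIG outputs for one station (illustrative values only)
lo, hi = evidential_interval(gamma=22.0, nu=1.5, alpha=3.0, beta=8.0)
print(f"gust interval: [{lo:.1f}, {hi:.1f}] m/s")
```

The interval widens wherever the evidence parameters (nu, alpha) are small, which is consistent with the abstract's finding that uncertainty flags intense storms and sharp spatial gust gradients for cautious interpretation.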
Evidential Deep Learning: Enhancing Predictive Uncertainty Estimation for Earth System Science Applications
Schreck, John S., Gagne, David John II, Becker, Charlie, Chapman, William E., Elmore, Kim, Gantos, Gabrielle, Kim, Eliot, Kimpara, Dhamma, Martin, Thomas, Molina, Maria J., Przybylo, Vanessa M., Radford, Jacob, Saavedra, Belen, Willson, Justin, Wirz, Christopher
Robust quantification of predictive uncertainty is critical for understanding the factors that drive weather and climate outcomes. Ensembles provide predictive uncertainty estimates and can be decomposed physically, but both physics-based and machine learning ensembles are computationally expensive. Parametric deep learning can estimate uncertainty with a single model by predicting the parameters of a probability distribution, but it does not account for epistemic uncertainty. Evidential deep learning, a technique that extends parametric deep learning to higher-order distributions, can account for both aleatoric and epistemic uncertainty with one model. This study compares the uncertainty estimates derived from evidential neural networks to those obtained from ensembles. Through applications to the classification of winter precipitation type and the regression of surface-layer fluxes, we show evidential deep learning models attaining predictive accuracy rivaling standard methods while robustly quantifying both sources of uncertainty. We evaluate the uncertainty in terms of how well the predictions are calibrated and how well the uncertainty correlates with prediction error. Analyses of uncertainty in the context of the inputs reveal sensitivities to underlying meteorological processes, facilitating interpretation of the models. The conceptual simplicity, interpretability, and computational efficiency of evidential neural networks make them highly extensible, offering a promising approach for reliable and practical uncertainty quantification in Earth system science modeling. To encourage broader adoption of evidential deep learning in Earth system science, we have developed a new Python package, MILES-GUESS (https://github.com/ai2es/miles-guess), that enables users to train and evaluate both evidential and ensemble deep learning models.
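As a concrete illustration of how an evidential classifier reports both a prediction and an epistemic uncertainty from one model, the sketch below follows the Dirichlet (subjective-logic) formulation of Sensoy et al. (2018), a common basis for evidential classification; it is an assumed minimal example, not the MILES-GUESS API, and the evidence values are hypothetical.

```python
# Minimal sketch (assumed Dirichlet/subjective-logic formulation;
# not the MILES-GUESS API): class probabilities plus epistemic
# uncertainty from a single evidential forward pass.
import numpy as np

def dirichlet_uncertainty(evidence):
    """evidence : shape-(K,) array of non-negative class evidence."""
    alpha = evidence + 1.0              # Dirichlet concentration parameters
    strength = alpha.sum()              # total Dirichlet strength
    probs = alpha / strength            # expected class probabilities
    vacuity = evidence.size / strength  # epistemic uncertainty in [0, 1]
    return probs, vacuity

# Hypothetical evidence for four winter precipitation types
# (rain, snow, sleet, freezing rain); values are illustrative only.
probs, u = dirichlet_uncertainty(np.array([9.0, 2.0, 0.5, 0.5]))
print(probs, f"epistemic (vacuity) = {u:.2f}")
```

Low total evidence inflates the vacuity term, so the same forward pass that produces the precipitation-type probabilities also signals when the model is extrapolating beyond its training data.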