Active learning for data-driven reduced models of parametric differential systems with Bayesian operator inference

McQuarrie, Shane A., Guo, Mengwu, Chaudhuri, Anirban

arXiv.org Machine Learning 

Numerical simulation of complex physical phenomena is a core enabling technology for digital twins, which consist of physical and virtual assets with a two-way flow of information: data from the physical asset is used to construct and/or calibrate the virtual asset (a numerical model), while numerical predictions from the virtual asset inform control or decision-making for the physical asset [42]. To be viable in practice, the virtual asset must produce predictions rapidly and reliably; however, the physics of interest for digital twin applications can typically be simulated accurately only with a large number of degrees of freedom, leading to computationally expensive numerical simulations. The explainability and computational efficiency of decisions made by the digital twin play a key role in safety-critical applications, making explainable artificial intelligence an essential ingredient [24].

Model reduction is one such explainable scientific machine learning technique: it constructs low-dimensional systems, called reduced-order models (ROMs), that serve as computationally inexpensive surrogates for a high-dimensional physics simulation [4, 20]. This paper introduces a technique for adaptively constructing ROMs that emulate systems with parametric dependence, that is, systems whose behavior varies with a set of parameters, usually representing physical properties. We focus on systems in which the parametric dependence manifests in the operators defining the model, not merely in the initial conditions or external inputs.
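To make the operator-level distinction concrete, consider the following minimal sketch (the notation here is illustrative, not taken from this paper): a semi-discrete linear system whose parametric dependence sits in the operators might take the form

```latex
% Operator-level parametric dependence: the system matrices themselves
% vary with the parameter vector \boldsymbol{\mu}.
\frac{\mathrm{d}\mathbf{q}}{\mathrm{d}t}(t;\boldsymbol{\mu})
  = \mathbf{A}(\boldsymbol{\mu})\,\mathbf{q}(t;\boldsymbol{\mu})
  + \mathbf{B}(\boldsymbol{\mu})\,\mathbf{u}(t),
\qquad
\mathbf{q}(0;\boldsymbol{\mu}) = \mathbf{q}_0 .
% By contrast, a system that is parametric only through its data would
% keep \mathbf{A} and \mathbf{B} fixed and place \boldsymbol{\mu} in the
% initial condition \mathbf{q}_0 or the input \mathbf{u}(t) instead.
```

Here $\mathbf{q}$ is the state, $\mathbf{u}$ an external input, and $\mathbf{A}(\boldsymbol{\mu})$, $\mathbf{B}(\boldsymbol{\mu})$ are parameter-dependent operators; it is this dependence of the operators on $\boldsymbol{\mu}$ that the present work targets.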