A novel data generation scheme for surrogate modelling with deep operator networks
Shivam Choubey, Birupaksha Pal, Manish Agrawal
arXiv.org Artificial Intelligence
However, due to intensive computational requirements, it is not feasible to deploy these techniques directly in numerous cases, such as parametric optimization, real-time prediction for control applications, etc. Machine learning-based surrogate models offer an efficient alternative for simulating physical systems. Deep learning, owing to its ability to model arbitrary input-output relationships efficiently, is the most widely adopted choice for surrogate modelling. In general, these surrogate models are data-driven: simulation or experimental data is used for training. Once trained, the surrogate model can predict the system output for unobserved inputs with minimal computational effort. For surrogate modelling, both vanilla and specialized neural networks, such as convolutional neural networks, have gained immense popularity in both scientific and industrial applications [1, 2]. Further, operator learning, a new paradigm in deep learning, was recently proposed in [3]. Various operator learning techniques have been introduced in the literature, such as deep operator networks (DeepONets) [4], Laplace neural operators (LNO) [5], Fourier neural operators (FNO) [6], and the General Neural Operator Transformer for Operator learning (GNOT) [7]. In this paper, we focus on DeepONets as an operator learning technique and present a novel way to reduce the computational cost associated with training the model. DeepONet is based on the lesser-known cousin of the 'Universal Approximation Theorem' for operators.
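To make the DeepONet architecture mentioned above concrete, the following is a minimal numpy sketch of its branch/trunk forward pass. The weights are random and untrained, and the sensor count `m`, hidden width, and latent dimension `p` are illustrative choices, not values from the paper: a branch network encodes the input function sampled at `m` sensor locations, a trunk network encodes the query coordinate, and their inner product yields the operator output G(u)(y).

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(sizes):
    """Random weights for a small fully connected network (illustrative only)."""
    return [(rng.standard_normal((m, n)) / np.sqrt(m), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def forward(params, x):
    """Apply the MLP with tanh activations on all but the last layer."""
    for i, (W, b) in enumerate(params):
        x = x @ W + b
        if i < len(params) - 1:
            x = np.tanh(x)
    return x

m, p = 50, 32                      # sensor points / latent features (assumed)
branch = mlp([m, 64, p])           # encodes the input function u
trunk = mlp([1, 64, p])            # encodes the query location y

def deeponet(u_sensors, y):
    b = forward(branch, u_sensors)  # (batch, p)
    t = forward(trunk, y)           # (n_points, p)
    return b @ t.T                  # (batch, n_points): G(u)(y)

u = rng.standard_normal((4, m))     # 4 sampled input functions
y = np.linspace(0, 1, 20)[:, None]  # 20 query coordinates
out = deeponet(u, y)
print(out.shape)  # (4, 20)
```

The key structural point is that the branch and trunk outputs only meet in the final inner product, so the trunk can be evaluated at arbitrary query points independently of the input function.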
Feb-24-2024