FEDONet : Fourier-Embedded DeepONet for Spectrally Accurate Operator Learning
Sojitra, Arth, Dhingra, Mrigank, San, Omer
Deep Operator Networks (DeepONets) have recently emerged as powerful data-driven frameworks for learning nonlinear operators, particularly suited for approximating solutions to partial differential equations. Despite their promising capabilities, the standard implementation of DeepONets, which typically employs fully connected linear layers in the trunk network, can encounter limitations in capturing complex spatial structures inherent to various PDEs. To address this limitation, we introduce Fourier-embedded trunk networks within the DeepONet architecture, leveraging random Fourier feature mappings to enrich spatial representation capabilities. Our proposed Fourier-Embedded DeepONet, FEDONet, demonstrates superior performance compared to the traditional DeepONet across a comprehensive suite of PDE-driven datasets, including the two-dimensional Poisson, Burgers', Lorenz-63, Eikonal, Allen-Cahn, and Kuramoto-Sivashinsky equations. FEDONet delivers consistently superior reconstruction accuracy across all benchmark PDEs, with particularly large relative $L^2$ error reductions observed in chaotic and stiff systems. This study highlights the effectiveness of Fourier embeddings in enhancing neural operator learning, offering a robust and broadly applicable methodology for PDE surrogate modeling.
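The random Fourier feature mapping referenced in the abstract can be sketched as follows. This is a generic illustration of the technique, not code from the FEDONet paper; the function name, frequency scale `sigma`, and feature count are our own illustrative choices.

```python
import numpy as np

def random_fourier_features(x, num_features=64, sigma=1.0, seed=0):
    """Map spatial coordinates x of shape (n, d) to the embedding
    [cos(2*pi*xB), sin(2*pi*xB)], where B is a fixed Gaussian random
    matrix whose entries have standard deviation `sigma` (controlling
    the frequency content of the embedding)."""
    rng = np.random.default_rng(seed)
    d = x.shape[1]
    B = rng.normal(0.0, sigma, size=(d, num_features))
    proj = 2.0 * np.pi * x @ B
    # Concatenating cos and sin doubles the feature dimension.
    return np.concatenate([np.cos(proj), np.sin(proj)], axis=1)

coords = np.random.rand(5, 2)            # 5 sample points in 2D
feats = random_fourier_features(coords)  # shape (5, 128)
```

In a DeepONet trunk, an embedding like this would be applied to the query coordinates before the fully connected layers, so that high-frequency spatial structure is representable from the first layer onward.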
S$^2$GPT-PINNs: Sparse and Small models for PDEs
Ji, Yajie, Chen, Yanlai, Koohy, Shawn
We propose S$^2$GPT-PINN, a sparse and small model for solving parametric partial differential equations (PDEs). Similar to Small Language Models (SLMs), S$^2$GPT-PINN is tailored to domain-specific (families of) PDEs and characterized by its compact architecture and minimal computational power. Leveraging a small amount of extremely high-quality data via a mathematically rigorous greedy algorithm that is enabled by large full-order models, S$^2$GPT-PINN relies on orders of magnitude fewer parameters than PINNs to achieve extremely high efficiency via two levels of customization. The first is knowledge distillation via task-specific activation functions that are transferred from pre-trained PINNs. The second is a judicious down-sampling when calculating the physics-informed loss of the network, compressing the number of data sites by orders of magnitude to match the size of the small model.
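The down-sampling of collocation sites described above can be illustrated with a simple residual-based selection rule. This is a minimal sketch of the general idea, assuming residual magnitude as the selection criterion; it does not reproduce the paper's rigorous greedy algorithm.

```python
import numpy as np

def downsample_collocation(points, residuals, keep=32):
    """Keep the `keep` collocation points where the PDE residual
    magnitude is largest, so the physics-informed loss is computed
    on far fewer sites. A stand-in for more principled greedy
    selection schemes."""
    idx = np.argsort(np.abs(residuals))[-keep:]
    return points[idx]

# Hypothetical usage: 100 candidate sites on [0, 1], residuals
# from some previously trained network.
pts = np.linspace(0.0, 1.0, 100).reshape(-1, 1)
res = np.sin(10.0 * pts[:, 0])           # placeholder residual values
subset = downsample_collocation(pts, res, keep=10)  # shape (10, 1)
```

The physics-informed loss would then be evaluated only on `subset`, reducing the per-step cost roughly in proportion to the compression ratio.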
Review for NeurIPS paper: Multipole Graph Neural Operator for Parametric Partial Differential Equations
Summary and Contributions: I have read the authors' feedback. I really appreciate the time you took to answer us, reviewers. Rest assured that I have devoted a substantial amount of time looking at your paper and response. Unfortunately, the authors' explanations do not add much to what I had already understood, and do not fully settle my concerns. On a few "subjective" matters: I disagree with the authors' response on "technicality" and "graphs", and I strongly suggest they take my advice into consideration in a future revision.
Review for NeurIPS paper: Multipole Graph Neural Operator for Parametric Partial Differential Equations
I agree with the authors that R1's concerns are not relevant to the acceptance decision and have removed their review from consideration. R2 raised the concern that there are insufficient benchmarks to judge the value of the work; the author rebuttal countered that the baselines identified by R2 do not attack the same use case as the proposed algorithm. I concur with this assessment. R2 also pointed out that the method was evaluated on 1D problems only; the authors rebutted that the method was demonstrated on both 1D and 2D problems, and gave an example of a Bayesian inverse problem that motivates this method even in low dimensions. R4 recommended acceptance because of the novelty of the proposed multipole graph neural operator.
Meta-Auto-Decoder for Solving Parametric Partial Differential Equations
Many important problems in science and engineering require solving the so-called parametric partial differential equations (PDEs), i.e., PDEs with different physical parameters, boundary conditions, shapes of computation domains, etc. Recently, building learning-based numerical solvers for parametric PDEs has become an emerging new field. They are typically unsupervised and mesh-free, but require going through the time-consuming network training process from scratch for each set of parameters of the PDE. Another category of methods such as Fourier Neural Operator (FNO) and Deep Operator Network (DeepONet) try to approximate the solution mapping directly. Being fast with only one forward inference for each PDE parameter without retraining, they often require a large corpus of paired input-output observations drawn from numerical simulations, and most of them need a predefined mesh as well.
Multipole Graph Neural Operator for Parametric Partial Differential Equations
One of the main challenges in using deep learning-based methods for simulating physical systems and solving partial differential equations (PDEs) is formulating physics-based data in the desired structure for neural networks. Graph neural networks (GNNs) have gained popularity in this area since graphs offer a natural way of modeling particle interactions and provide a clear way of discretizing the continuum models. However, the graphs constructed for approximating such tasks usually ignore long-range interactions due to unfavorable scaling of the computational complexity with respect to the number of nodes. Inspired by the classical multipole methods, we propose a novel multi-level graph neural network framework that captures interactions at all ranges with only linear complexity. Our multi-level formulation is equivalent to recursively adding inducing points to the kernel matrix, unifying GNNs with multi-resolution matrix factorization of the kernel.
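The inducing-point view mentioned in the abstract can be sketched at a single level as a Nyström-style low-rank kernel approximation. The multipole operator recurses this construction over several resolutions; this one-level sketch, with an assumed RBF kernel and uniformly sampled inducing points, is our illustration rather than the paper's code.

```python
import numpy as np

def rbf(X, Y, lengthscale=0.5):
    """Squared-exponential kernel between point sets X (n, d) and Y (m, d)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * lengthscale ** 2))

def nystrom_kernel(X, num_inducing=16, seed=0):
    """Approximate the full n x n kernel K(X, X) through m inducing
    points: K ~ K_xz @ K_zz^{-1} @ K_zx, costing O(n m) kernel
    evaluations instead of O(n^2). Adding levels of inducing points
    recursively yields the multi-resolution factorization."""
    rng = np.random.default_rng(seed)
    Z = X[rng.choice(len(X), num_inducing, replace=False)]
    Kxz = rbf(X, Z)                                     # (n, m)
    Kzz = rbf(Z, Z) + 1e-6 * np.eye(num_inducing)       # jitter for stability
    return Kxz @ np.linalg.solve(Kzz, Kxz.T)            # rank-m approximation
```

The point of the multi-level scheme is that short-range interactions are kept exact on the fine graph while long-range interactions are summarized through the coarser inducing levels, giving all-range coverage at linear cost.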
Random Grid Neural Processes for Parametric Partial Differential Equations
Vadeboncoeur, Arnaud, Kazlauskaite, Ieva, Papandreou, Yanni, Cirak, Fehmi, Girolami, Mark, Akyildiz, Ömer Deniz
We introduce a new class of spatially stochastic physics and data informed deep latent models for parametric partial differential equations (PDEs) which operate through scalable variational neural processes. We achieve this by assigning probability measures to the spatial domain, which allows us to treat collocation grids probabilistically as random variables to be marginalised out. Adapting this spatial statistics view, we solve forward and inverse problems for parametric PDEs in a way that leads to the construction of Gaussian process models of solution fields. The implementation of these random grids poses a unique set of challenges for inverse physics informed deep learning frameworks and we propose a new architecture called Grid Invariant Convolutional Networks (GICNets) to overcome these challenges. We further show how to incorporate noisy data in a principled manner into our physics informed model to improve predictions for problems where data may be available but whose measurement location does not coincide with any fixed mesh or grid. The proposed method is tested on a nonlinear Poisson problem, Burgers equation, and Navier-Stokes equations, and we provide extensive numerical comparisons. We demonstrate significant computational advantages over current physics informed neural learning methods for parametric PDEs while improving the predictive capabilities and flexibility of these models.
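Treating the collocation grid as a random variable to be marginalized out can be illustrated with a simple Monte Carlo estimate of the residual loss over random grids. This is a minimal sketch assuming a uniform probability measure on the domain; the paper's variational treatment and GICNet architecture are not reproduced here.

```python
import numpy as np

def expected_residual_loss(residual_fn, domain_lo, domain_hi,
                           num_grids=8, grid_size=64, seed=0):
    """Estimate E_grid[ mean residual^2 ] by drawing several random
    collocation grids from a uniform measure on the box
    [domain_lo, domain_hi] and averaging the per-grid loss."""
    rng = np.random.default_rng(seed)
    losses = []
    for _ in range(num_grids):
        pts = rng.uniform(domain_lo, domain_hi,
                          size=(grid_size, len(domain_lo)))
        losses.append(np.mean(residual_fn(pts) ** 2))
    return float(np.mean(losses))

# Hypothetical usage with a placeholder residual on the 1D unit interval.
loss = expected_residual_loss(lambda p: p.sum(axis=1), [0.0], [1.0])
```

Because each training step sees a freshly sampled grid, the learned solution field is not tied to any fixed mesh, which is also what lets scattered measurement data enter the loss at their true locations.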