

Lightweight Geometric Adaptation for Training Physics-Informed Neural Networks

An, Kang, Si, Chenhao, Ma, Shiqian, Yan, Ming

arXiv.org Machine Learning

Physics-Informed Neural Networks (PINNs) often suffer from slow convergence, training instability, and reduced accuracy on challenging partial differential equations due to the anisotropic and rapidly varying geometry of their loss landscapes. We propose a lightweight curvature-aware optimization framework that augments existing first-order optimizers with an adaptive predictive correction based on secant information. Consecutive gradient differences are used as a cheap proxy for local geometric change, together with a step-normalized secant curvature indicator to control the correction strength. The framework is plug-and-play, computationally efficient, and broadly compatible with existing optimizers, without explicitly forming second-order matrices. Experiments on diverse PDE benchmarks show consistent improvements in convergence speed, training stability, and solution accuracy over standard optimizers and strong baselines, including on the high-dimensional heat equation, Gray--Scott system, Belousov--Zhabotinsky system, and 2D Kuramoto--Sivashinsky system.
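The abstract describes augmenting a first-order optimizer with a correction built from consecutive gradient differences, damped by a step-normalized secant curvature indicator. A minimal sketch of that idea is below; the function name, the specific damping rule `beta / (1 + kappa)`, and all constants are illustrative assumptions, not the paper's exact update:

```python
import numpy as np

def secant_corrected_step(x, grad, prev_x, prev_grad, lr=1e-3, beta=0.1, eps=1e-12):
    """One illustrative update: a base gradient step plus a secant-based
    predictive correction (sketch only; the paper's exact rule may differ)."""
    s = x - prev_x            # last step taken
    y = grad - prev_grad      # gradient change: cheap secant/curvature proxy
    # Step-normalized secant curvature indicator (assumed form):
    kappa = np.linalg.norm(y) / (np.linalg.norm(s) + eps)
    # Damp the correction where local curvature is large.
    alpha = beta / (1.0 + kappa)
    return x - lr * (grad + alpha * y)

# Toy anisotropic quadratic f(x) = 0.5 * x @ A @ x, mimicking an
# ill-conditioned loss landscape.
A = np.diag([1.0, 100.0])
x_prev = np.array([1.0, 1.0])
g_prev = A @ x_prev
x = x_prev - 1e-3 * g_prev    # plain first step to seed the secant pair
for _ in range(200):
    g = A @ x
    x, x_prev, g_prev = secant_corrected_step(x, g, x_prev, g_prev), x, g
```

Because the correction reuses gradients the optimizer already computes, no second-order matrix is ever formed, which is what keeps the scheme plug-and-play.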






Hypernetwork-based Meta-Learning for Low-Rank Physics-Informed Neural Networks

Neural Information Processing Systems

PINNs, however, share the same weakness as coordinate-based MLPs (or INRs), which hinders the application of PINNs/INRs to more diverse settings: for a new data instance (e.g., a new PDE for PINNs or a new image for INRs), a new neural network must be trained, typically from scratch.




Is L2 Physics-Informed Loss Always Suitable for Training Physics-Informed Neural Network?

Neural Information Processing Systems

In particular, we leverage the concept of stability in the literature of partial differential equations to study the asymptotic behavior of the learned solution as the loss approaches zero. With this concept, we study an important class of high-dimensional non-linear PDEs in optimal control, the Hamilton-Jacobi-Bellman (HJB) equation, and prove that for a general L^p physics-informed loss, a wide class of HJB equations is stable only if p is sufficiently large.
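The role of p can be seen directly in an empirical L^p physics-informed loss over PDE residuals at collocation points: as p grows, the loss increasingly weights the worst-case residual, which is the regime the stability result favors for HJB equations. The sketch below assumes precomputed residual values; the function name is hypothetical:

```python
import numpy as np

def lp_physics_loss(residuals, p=2):
    """Empirical L^p physics-informed loss, ( mean |r|^p )^(1/p),
    over PDE residuals r at collocation points (illustrative form)."""
    r = np.abs(np.asarray(residuals, dtype=float))
    return np.mean(r ** p) ** (1.0 / p)

# One large residual among small ones: larger p concentrates the loss on it.
res = np.array([0.1, 0.2, 2.0, 0.05])
print(lp_physics_loss(res, p=2))    # moderate weight on the outlier
print(lp_physics_loss(res, p=8))    # much closer to the max residual
```

In the limit p → ∞ this loss approaches the sup-norm of the residual, so "p sufficiently large" can be read as requiring near-uniform control of the PDE residual.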


Nonparametric Boundary Geometry in Physics Informed Deep Learning

Neural Information Processing Systems

Engineering design problems frequently require solving systems of partial differential equations with boundary conditions specified on object geometries in the form of a triangular mesh. These boundary geometries are provided by a designer and are problem dependent. The efficiency of the design process greatly benefits from fast turnaround times when repeatedly solving PDEs on various geometries. However, most current work that uses machine learning to speed up the solution process relies heavily on a fixed parameterization of the geometry, which cannot be changed after training. This severely limits the possibility of reusing a trained model across a variety of design problems. In this work, we propose a novel neural operator architecture which accepts boundary geometry, in the form of triangular meshes, as input and produces an approximate solution to a given PDE as output. Once trained, the model can be used to rapidly estimate the PDE solution over a new geometry, without the need for retraining or representation of the geometry with a pre-specified parameterization.