A Comparative Investigation of Thermodynamic Structure-Informed Neural Networks

Li, Guojie; Hong, Liu

arXiv.org Machine Learning

Physics-informed neural networks (PINNs) offer a unified framework for solving both forward and inverse problems of differential equations, yet their performance and physical consistency strongly depend on how the governing laws are incorporated. In this work, we present a systematic comparison of thermodynamic structure-informed neural networks built on different thermodynamic formulations, including Newtonian, Lagrangian, and Hamiltonian mechanics for conservative systems, as well as the Onsager variational principle and extended irreversible thermodynamics for dissipative systems. Through comprehensive numerical experiments on representative ordinary and partial differential equations, we quantitatively evaluate the impact of these formulations on accuracy, physical consistency, noise robustness, and interpretability. The results show that Newtonian-residual-based PINNs can reconstruct system states but fail to reliably recover key physical and thermodynamic quantities, whereas structure-preserving formulations significantly enhance parameter identification, thermodynamic consistency, and robustness. These findings provide practical guidance for the principled design of thermodynamically consistent models, and lay the groundwork for integrating more general nonequilibrium thermodynamic structures into physics-informed machine learning.
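To make the "Newtonian-residual" idea the abstract compares against concrete, here is a minimal sketch of the residual loss such a PINN would minimize for a harmonic oscillator x'' + ω²x = 0. This is an illustration, not code from the paper: the network output is stood in for by the exact solution cos(ωt), and derivatives are taken by central finite differences instead of automatic differentiation.

```python
import numpy as np

def newtonian_residual(x, t, omega):
    """Residual r(t) = x'' + omega^2 * x of the equation of motion,
    with x'' approximated by central finite differences (illustrative;
    a real PINN would use autodiff on the network output)."""
    dt = t[1] - t[0]
    x_tt = (x[2:] - 2.0 * x[1:-1] + x[:-2]) / dt**2
    return x_tt + omega**2 * x[1:-1]

omega = 2.0
t = np.linspace(0.0, 3.0, 2001)
x = np.cos(omega * t)  # exact trajectory; a trained PINN should output this

# Mean-squared residual: the physics term of a Newtonian-residual PINN loss.
loss = np.mean(newtonian_residual(x, t, omega) ** 2)
print(loss)  # near zero for the exact solution
```

As the abstract notes, driving this residual to zero constrains the trajectory but does not by itself enforce energy conservation or other thermodynamic structure, which is why structure-preserving formulations can recover physical quantities more reliably.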

Hypernetwork-based Meta-Learning for Low-Rank Physics-Informed Neural Networks

Neural Information Processing Systems

PINNs, however, share the same weakness as coordinate-based MLPs (or INRs), which hinders the application of PINNs/INRs to more diverse settings: for a new data instance (e.g., a new PDE for PINNs or a new image for INRs), training a new neural network (typically from