GITO: Graph-Informed Transformer Operator for Learning Complex Partial Differential Equations
Milad Ramezankhani, Janak M. Patel, Anirudh Deodhar, Dagnachew Birru
arXiv.org Artificial Intelligence
We present a novel graph-informed transformer operator (GITO) architecture for learning complex partial differential equation systems defined on irregular geometries and non-uniform meshes. GITO consists of two main modules: a hybrid graph transformer (HGT) and a transformer neural operator (TNO). The HGT leverages a graph neural network (GNN) to encode local spatial relationships and a transformer to capture long-range dependencies; a self-attention fusion layer integrates the outputs of the GNN and transformer branches to enable more expressive feature learning on graph-structured data. The TNO module employs linear-complexity cross-attention and self-attention layers to map encoded input functions to predictions at arbitrary query locations, ensuring discretization invariance and enabling zero-shot super-resolution on any mesh. Empirical results on benchmark PDE tasks demonstrate that GITO outperforms existing transformer-based neural operators, paving the way for efficient, mesh-agnostic surrogate solvers in engineering applications.
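The abstract's two-branch design can be illustrated with a minimal sketch: a GNN-style mean aggregation over graph neighbors (local), scaled dot-product self-attention over all nodes (long-range), an attention-based fusion of the two branch outputs, and a cross-attention readout at query points in the spirit of the TNO module. All function names, dimensions, and the specific aggregation/fusion rules here are illustrative assumptions, not the paper's actual implementation (which uses learned projections and a linear-complexity attention variant).

```python
import math

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def gnn_layer(feats, adj):
    # Local branch: mean aggregation over each node's neighbors plus itself
    # (a simplified stand-in for a learned message-passing layer).
    out = []
    for i, f in enumerate(feats):
        nbrs = [feats[j] for j in adj[i]] + [f]
        out.append([sum(v[k] for v in nbrs) / len(nbrs) for k in range(len(f))])
    return out

def cross_attention(q_feats, kv_feats):
    # Each query attends over all key/value features with scaled dot-product
    # weights; with queries drawn from arbitrary locations this mimics the
    # discretization-invariant readout described for the TNO module.
    d = len(kv_feats[0])
    out = []
    for q in q_feats:
        scores = softmax([dot(q, k) / math.sqrt(d) for k in kv_feats])
        out.append([sum(w * v[k] for w, v in zip(scores, kv_feats)) for k in range(d)])
    return out

def self_attention(feats):
    # Global branch: all-pairs attention, i.e. cross-attention of a set with itself.
    return cross_attention(feats, feats)

def fuse(local, global_):
    # Fusion: attend over the {local, global} pair per node and average
    # (a toy stand-in for the paper's self-attention fusion layer).
    out = []
    for a, b in zip(local, global_):
        f = self_attention([a, b])
        out.append([(f[0][k] + f[1][k]) / 2 for k in range(len(a))])
    return out

# Toy usage: a 3-node path graph with 2-d node features.
feats = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
adj = {0: [1], 1: [0, 2], 2: [1]}
fused = fuse(gnn_layer(feats, adj), self_attention(feats))
```

The point of the sketch is the division of labor: the GNN sees only mesh-adjacent nodes, the attention branch sees every node, and the fused features can then be queried at locations that need not coincide with the input mesh.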
Jun-18-2025