Hamiltonian vector field


Space filling positionality and the Spiroformer

Maurin, M., Evangelista-Alvarado, M. Á., Suárez-Serrato, P.

arXiv.org Artificial Intelligence

Transformers excel at sequential data. When generalizing transformer models to geometric domains such as manifolds, however, we encounter the problem that there is no well-defined global order. We propose a solution in which attention heads follow a space-filling curve. As a first experimental example, we present the Spiroformer, a transformer that follows a polar spiral on the $2$-sphere.
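The abstract does not spell out the serialization, but the core idea of imposing a 1D order on sphere points via a polar spiral can be sketched as follows. This is a minimal illustration, not the paper's method: the functions `polar_spiral` and `spiral_order` and the `turns`/`n_anchor` parameters are assumptions of this sketch.

```python
import numpy as np

def polar_spiral(n_points: int, turns: int = 12) -> np.ndarray:
    """Sample a polar spiral on S^2: the colatitude sweeps from pole to
    pole while the azimuth winds around `turns` times."""
    t = np.linspace(0.0, 1.0, n_points)
    theta = np.pi * t                  # colatitude, 0..pi
    phi = 2.0 * np.pi * turns * t      # azimuth winds along the sweep
    return np.stack([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)], axis=-1)

def spiral_order(points: np.ndarray, n_anchor: int = 4096) -> np.ndarray:
    """Order `points` (N unit vectors, shape (N, 3)) by their nearest
    position along the spiral, inducing a global 1D order on the sphere."""
    anchors = polar_spiral(n_anchor)
    # for unit vectors, the largest dot product picks the nearest anchor,
    # and the anchor index is the position along the curve
    nearest = np.argmax(points @ anchors.T, axis=1)
    return np.argsort(nearest, kind="stable")

# usage: serialize random sphere points for a sequence model
rng = np.random.default_rng(0)
pts = rng.normal(size=(256, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)
sequence = pts[spiral_order(pts)]  # (256, 3), ready for attention
```

Once serialized this way, the points admit ordinary positional encodings, which is what makes a standard transformer applicable on the manifold.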


SymFlux: deep symbolic regression of Hamiltonian vector fields

Evangelista-Alvarado, M. A., Suárez-Serrato, P.

arXiv.org Artificial Intelligence

We present SymFlux, a novel deep learning framework that performs symbolic regression to identify Hamiltonian functions from their corresponding vector fields on the standard symplectic plane. SymFlux models utilize hybrid CNN-LSTM architectures to learn and output the symbolic mathematical expression of the underlying Hamiltonian. Training and validation are conducted on newly developed datasets of Hamiltonian vector fields, a key contribution of this work. Our results demonstrate the model's effectiveness in accurately recovering these symbolic expressions, advancing automated discovery in Hamiltonian mechanics.
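On the standard symplectic plane with coordinates $(q, p)$, the Hamiltonian vector field of $H$ is $X_H = (\partial H/\partial p,\, -\partial H/\partial q)$. The sketch below shows how one could generate such vector-field samples from a symbolic $H$; it is an illustrative assumption about the dataset construction, not the SymFlux pipeline itself.

```python
import sympy as sp
import numpy as np

q, p = sp.symbols("q p")

def hamiltonian_vector_field(H: sp.Expr):
    """X_H = (dH/dp, -dH/dq) on the standard symplectic plane (q, p)."""
    Xq = sp.diff(H, p)
    Xp = -sp.diff(H, q)
    return sp.lambdify((q, p), (Xq, Xp), "numpy")

# example: the pendulum Hamiltonian H = p^2/2 - cos(q)
H = p**2 / 2 - sp.cos(q)
X = hamiltonian_vector_field(H)

# sample the field on a grid: (field values, symbolic string of H) would
# form one input/label pair of a symbolic-regression training set
qs, ps = np.meshgrid(np.linspace(-3, 3, 32), np.linspace(-3, 3, 32))
Xq_vals, Xp_vals = X(qs, ps)
print(sp.sstr(H), Xq_vals.shape, Xp_vals.shape)
```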


A Structure-Preserving Kernel Method for Learning Hamiltonian Systems

Hu, Jianyu, Ortega, Juan-Pablo, Yin, Daiying

arXiv.org Machine Learning

A structure-preserving kernel ridge regression method is presented that allows the recovery of potentially high-dimensional and nonlinear Hamiltonian functions from datasets of noisy observations of Hamiltonian vector fields. The method admits a closed-form solution whose numerical performance surpasses that of other techniques proposed in the literature for this setup. Methodologically, the paper extends kernel regression to problems whose loss functions involve linear functions of gradients; in particular, a differential reproducing property and a Representer Theorem are proved in this context. The relation between the structure-preserving kernel estimator and the Gaussian posterior mean estimator is analyzed. A full error analysis provides convergence rates under fixed and adaptive regularization parameters. The good performance of the proposed estimator is illustrated with various numerical experiments.
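As a loose illustration of the idea only, not the paper's closed-form estimator, one can represent $\hat H(x) = \sum_j \alpha_j k(x, x_j)$ and fit $\alpha$ so that $J \nabla \hat H$ matches the noisy vector-field observations, with an RKHS ridge penalty $\lambda\, \alpha^\top K \alpha$. The Gaussian kernel, the function names, and the normal-equation solve below are assumptions of this sketch.

```python
import numpy as np

def rbf(X, Z, sigma=1.0):
    """Gaussian kernel matrix k(x_i, z_j) = exp(-||x_i - z_j||^2 / (2 sigma^2))."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

def grad_rbf(X, Z, sigma=1.0):
    """Gradient in x of the Gaussian kernel: shape (n, m, d)."""
    K = rbf(X, Z, sigma)
    return -(X[:, None, :] - Z[None, :, :]) / sigma**2 * K[:, :, None]

def fit_hamiltonian(X, Y, sigma=1.0, lam=1e-6):
    """Fit H(x) = sum_j alpha_j k(x, x_j) so that J grad H matches the
    observed field Y, with RKHS ridge penalty lam * alpha^T K alpha."""
    n, d = X.shape
    J = np.array([[0.0, 1.0], [-1.0, 0.0]])  # standard symplectic form on (q, p)
    G = grad_rbf(X, X, sigma)                # (n, n, d)
    # predicted field component c at x_i from basis j: sum_d J[c,d] G[i,j,d]
    A = np.einsum("cd,ijd->icj", J, G).reshape(n * d, n)
    K = rbf(X, X, sigma)
    # normal equations of ||A alpha - y||^2 + lam * alpha^T K alpha
    alpha = np.linalg.solve(A.T @ A + lam * K, A.T @ Y.reshape(-1))
    return lambda Xnew: rbf(Xnew, X, sigma) @ alpha

# harmonic oscillator: H = (q^2 + p^2)/2, X_H = (p, -q), with observation noise
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
Y = np.stack([X[:, 1], -X[:, 0]], axis=1) + 0.01 * rng.normal(size=(200, 2))
H_hat = fit_hamiltonian(X, Y, sigma=0.5, lam=1e-8)
# note: H is recoverable only up to an additive constant
```

Fitting gradients rather than function values is exactly why the paper needs a differential reproducing property: the loss touches $\nabla \hat H$ at the sample points, not $\hat H$ itself.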