A Lorentz-Equivariant Transformer for All of the LHC
Brehmer, Johann, Bresó, Víctor, de Haan, Pim, Plehn, Tilman, Qu, Huilin, Spinner, Jonas, Thaler, Jesse
We show that the Lorentz-Equivariant Geometric Algebra Transformer (L-GATr) yields state-of-the-art performance for a wide range of machine learning tasks at the Large Hadron Collider. L-GATr represents data in a geometric algebra over space-time and is equivariant under Lorentz transformations. The underlying architecture is a versatile and scalable transformer, which is able to break symmetries if needed. We demonstrate the power of L-GATr for amplitude regression and jet classification, and then benchmark it as the first Lorentz-equivariant generative network. For all three LHC tasks, we find significant improvements over previous architectures.
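The abstract describes representing collider data as elements of the spacetime geometric algebra and building layers that are equivariant under Lorentz transformations. Below is a minimal numerical sketch of that idea, not the authors' implementation: it assumes a Cl(1,3) multivector with basis ordering [scalar | 4 vector | 6 bivector | 4 trivector | pseudoscalar], embeds a four-momentum into the vector grade, and checks that a toy grade-wise linear layer commutes with a Lorentz boost. All function names and the layer itself are illustrative assumptions.

```python
import numpy as np

# Minkowski metric with signature (+, -, -, -), as used for the spacetime algebra Cl(1, 3).
ETA = np.diag([1.0, -1.0, -1.0, -1.0])

def embed_momentum(p):
    """Embed a four-momentum (E, px, py, pz) into the vector grade of a
    16-dimensional Cl(1,3) multivector (assumed basis ordering:
    scalar | 4 vector | 6 bivector | 4 trivector | pseudoscalar)."""
    x = np.zeros(16)
    x[1:5] = p
    return x

def boost_z(beta):
    """4x4 Lorentz boost along the z-axis with velocity beta."""
    gamma = 1.0 / np.sqrt(1.0 - beta**2)
    L = np.eye(4)
    L[0, 0] = L[3, 3] = gamma
    L[0, 3] = L[3, 0] = gamma * beta
    return L

def act_on_vector_grade(L, x):
    """Action of a Lorentz transformation on the multivector's vector grade
    (higher grades transform with induced representations, omitted here)."""
    y = x.copy()
    y[1:5] = L @ x[1:5]
    return y

def equivariant_linear(x, w_scalar=0.7, w_vector=1.3):
    """Toy equivariant linear layer: grade-wise rescaling.  Because it acts
    uniformly within each grade, it commutes with the Lorentz action."""
    y = x.copy()
    y[0] *= w_scalar
    y[1:5] *= w_vector
    return y

p = np.array([50.0, 10.0, -5.0, 30.0])   # toy four-momentum in GeV
x = embed_momentum(p)
L = boost_z(0.6)

lhs = equivariant_linear(act_on_vector_grade(L, x))   # transform, then layer
rhs = act_on_vector_grade(L, equivariant_linear(x))   # layer, then transform
print(np.allclose(lhs, rhs))                          # True: the toy layer is equivariant

# The Minkowski norm E^2 - |p|^2 is invariant under the boost:
print(p @ ETA @ p, (L @ p) @ ETA @ (L @ p))
```

In L-GATr itself, all linear, attention, and nonlinear layers are constructed so that this commutation property holds for the full multivector, which is what the abstract means by equivariance under Lorentz transformations.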
arXiv.org Artificial Intelligence
Dec-22-2024