OmniJet-$\alpha_C$: Learning point cloud calorimeter simulations using generative transformers

Joschka Birk, Frank Gaede, Anna Hallin, Gregor Kasieczka, Martina Mozzanica, Henning Rose


Machine learning (ML) methods have been a common ingredient in particle physics research for a long time, with neural networks being applied to object identification already in analyses at LEP [1]. Since then, the range of applications has grown drastically, with ML methods being developed and used for example in tagging [2-4], anomaly detection [5-8], individual reconstruction stages like particle tracking [9-11], or even full event interpretation and reconstruction [12]. Another important use case for ML in high energy physics (HEP) is detector simulation.

A foundation model is a machine learning model that has been pre-trained on a large amount of data and can then be fine-tuned for different downstream tasks [61]. The idea behind utilizing pre-trained models is that their outputs can significantly enhance the performance of downstream tasks, yielding better results than if the model were to be trained from scratch. While the models mentioned above have focused on exploring different tasks in specific subdomains, like jet physics, a more ambitious goal eventually would be to develop a foundation model for all tasks in all subdomains, including for example …
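To make the pre-train/fine-tune pattern described above concrete, the following is a minimal sketch in generic PyTorch, not the paper's actual OmniJet-$\alpha_C$ code: a stand-in transformer backbone is loaded with (hypothetical) pre-trained weights, optionally frozen, and a freshly initialized head is trained on a downstream task. All names (`Backbone`, `FineTuned`, `pretrained.pt`) and hyperparameters are illustrative assumptions.

```python
# Minimal pre-train / fine-tune sketch (hypothetical names, not the paper's code).
import torch
import torch.nn as nn

class Backbone(nn.Module):
    """Stand-in for a large pre-trained model (e.g. a transformer encoder)."""
    def __init__(self, dim=128):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=8, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=4)

    def forward(self, x):                    # x: (batch, tokens, dim)
        return self.encoder(x).mean(dim=1)   # pooled per-event representation

class FineTuned(nn.Module):
    """Pre-trained backbone plus a freshly initialized downstream head."""
    def __init__(self, backbone, dim=128, n_classes=2):
        super().__init__()
        self.backbone = backbone
        self.head = nn.Linear(dim, n_classes)  # new task-specific head

    def forward(self, x):
        return self.head(self.backbone(x))

backbone = Backbone()
# backbone.load_state_dict(torch.load("pretrained.pt"))  # hypothetical checkpoint
model = FineTuned(backbone)

# Freeze the backbone so only the new head is updated during fine-tuning.
for p in model.backbone.parameters():
    p.requires_grad = False

optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(4, 16, 128)        # toy batch: 4 events, 16 tokens each
y = torch.randint(0, 2, (4,))      # toy downstream labels
optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()
```

Whether the backbone stays frozen or is unfrozen after a warm-up phase is a design choice; the benefit over training from scratch comes from reusing the representation learned during pre-training.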