Ultra Fast Transformers on FPGAs for Particle Physics Experiments
Jiang, Zhixing, Yin, Dennis, Khoda, Elham E., Loncar, Vladimir, Govorkova, Ekaterina, Moreno, Eric, Harris, Philip, Hauck, Scott, Hsu, Shih-Chieh
This work introduces a highly efficient implementation of the transformer architecture on a Field-Programmable Gate Array (FPGA) using the \texttt{hls4ml} tool. Given the demonstrated effectiveness of transformer models across a wide range of problems, their application in experimental triggers within particle physics is a subject of significant interest. In this work, we have implemented the critical components of a transformer model, including the multi-head attention and softmax layers. To evaluate the effectiveness of our implementation, we have focused on a particle physics jet flavor tagging problem, employing a public dataset. We recorded a latency under 2 $\mu$s on a Xilinx UltraScale+ FPGA, which is compatible with the hardware trigger requirements of the CERN Large Hadron Collider experiments.
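The two components named in the abstract, multi-head attention and softmax, can be sketched in NumPy as a functional reference for what an HLS implementation must compute. This is an illustrative sketch, not the authors' code: the function names, shapes, and single-precision floating-point arithmetic are assumptions (an FPGA implementation would typically use fixed-point types and a hardware-friendly softmax approximation).

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability before exponentiating.
    e = np.exp(x - np.max(x, axis=axis, keepdims=True))
    return e / np.sum(e, axis=axis, keepdims=True)

def multi_head_attention(x, Wq, Wk, Wv, Wo, n_heads):
    # x: (seq_len, d_model); Wq/Wk/Wv/Wo: (d_model, d_model).
    seq_len, d_model = x.shape
    d_head = d_model // n_heads
    # Project inputs and split the model dimension across heads.
    q = (x @ Wq).reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    k = (x @ Wk).reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    v = (x @ Wv).reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    # Scaled dot-product attention, per head.
    scores = softmax(q @ k.transpose(0, 2, 1) / np.sqrt(d_head))
    # Concatenate heads and apply the output projection.
    out = (scores @ v).transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ Wo

# Hypothetical sizes for illustration only.
rng = np.random.default_rng(0)
seq_len, d_model, n_heads = 8, 16, 4
x = rng.normal(size=(seq_len, d_model))
Wq, Wk, Wv, Wo = (rng.normal(size=(d_model, d_model)) for _ in range(4))
y = multi_head_attention(x, Wq, Wk, Wv, Wo, n_heads)
```

In a tool flow such as \texttt{hls4ml}, each of these matrix multiplications and the softmax become pipelined hardware blocks, which is how microsecond-scale latencies become achievable.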