Merz, Garrett W.
Transforming the Bootstrap: Using Transformers to Compute Scattering Amplitudes in Planar N = 4 Super Yang-Mills Theory
Cai, Tianji, Merz, Garrett W., Charton, François, Nolte, Niklas, Wilhelm, Matthias, Cranmer, Kyle, Dixon, Lance J.
We pursue the use of deep learning methods to improve state-of-the-art computations in theoretical high-energy physics. Planar N = 4 Super Yang-Mills theory is a close cousin of the theory that describes Higgs boson production at the Large Hadron Collider; its scattering amplitudes are large mathematical expressions containing integer coefficients. In this paper, we apply Transformers to predict these coefficients. The problem can be formulated in a language-like representation amenable to standard cross-entropy training objectives. We design two related experiments and show that the model achieves high accuracy (> 98%) on both tasks. Our work shows that Transformers can be applied successfully to problems in theoretical physics that require exact solutions.
14 Examples of How LLMs Can Transform Materials Science and Chemistry: A Reflection on a Large Language Model Hackathon
Jablonka, Kevin Maik, Ai, Qianxiang, Al-Feghali, Alexander, Badhwar, Shruti, Bocarsly, Joshua D., Bran, Andres M, Bringuier, Stefan, Brinson, L. Catherine, Choudhary, Kamal, Circi, Defne, Cox, Sam, de Jong, Wibe A., Evans, Matthew L., Gastellu, Nicolas, Genzling, Jerome, Gil, María Victoria, Gupta, Ankur K., Hong, Zhi, Imran, Alishba, Kruschwitz, Sabine, Labarre, Anne, Lála, Jakub, Liu, Tao, Ma, Steven, Majumdar, Sauradeep, Merz, Garrett W., Moitessier, Nicolas, Moubarak, Elias, Mouriño, Beatriz, Pelkie, Brenden, Pieler, Michael, Ramos, Mayk Caldas, Ranković, Bojana, Rodriques, Samuel G., Sanders, Jacob N., Schwaller, Philippe, Schwarting, Marcus, Shi, Jiale, Smit, Berend, Smith, Ben E., Van Herck, Joren, Völker, Christoph, Ward, Logan, Warren, Sean, Weiser, Benjamin, Zhang, Sylvester, Zhang, Xiaoqi, Zia, Ghezal Ahmad, Scourtas, Aristana, Schmidt, KJ, Foster, Ian, White, Andrew D., Blaiszik, Ben
Large language models (LLMs) such as GPT-4 have caught the interest of many scientists. Recent studies suggest that these models could be useful in chemistry and materials science. To explore these possibilities, we organized a hackathon. This article chronicles the projects built as part of this hackathon. Participants employed LLMs for various applications, including predicting properties of molecules and materials, designing novel interfaces for tools, extracting knowledge from unstructured data, and developing new educational applications. The diverse topics and the fact that working prototypes could be generated in less than two days highlight that LLMs will profoundly impact the future of our fields. The rich collection of ideas and projects also indicates that the applications of LLMs are not limited to materials science and chemistry but offer potential benefits to a wide range of scientific disciplines.