Fast and Sample-Efficient Interatomic Neural Network Potentials for Molecules and Materials Based on Gaussian Moments

Viktor Zaverkin, David Holzmüller, Ingo Steinwart, Johannes Kästner

arXiv.org Machine Learning 

Approximate methods, such as empirical force fields (FFs) [1-3], are an integral part of modern computational chemistry and materials science. While the application of first-principles methods, such as density functional theory (DFT), to even moderately sized molecular and material systems is computationally very expensive, approximate methods allow for simulations of large systems over long time scales. Over the last decades, machine-learned potentials (MLPs) [4-33] have risen in popularity owing to their ability to reach the accuracy of the respective first-principles reference methods, their transferability to systems of arbitrary size, and, in contrast to empirical FFs [34], their capability to describe bond breaking and bond formation. The interpolation capabilities of neural networks (NNs) [35] have promoted their broad application in computational chemistry and materials science. NNs were initially applied to represent potential energy surfaces (PESs) of small atomistic systems [36, 37] and were later extended to high-dimensional systems [21].