Goto

Collaborating Authors: Hackett, Daniel C.


Flow-based sampling for multimodal and extended-mode distributions in lattice field theory

arXiv.org Artificial Intelligence

Recent results have demonstrated that samplers constructed with flow-based generative models are a promising new approach for configuration generation in lattice field theory. In this paper, we present a set of training- and architecture-based methods to construct flow models for targets with multiple separated modes (i.e., vacua) as well as targets with extended/continuous modes. We demonstrate the application of these methods to modeling two-dimensional real and complex scalar field theories in their symmetry-broken phases. In this context we investigate different flow-based sampling algorithms, including a composite sampling algorithm where flow-based proposals are occasionally augmented by applying updates using traditional algorithms like HMC.
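
As a rough illustration of the composite sampling algorithm mentioned above, the Python sketch below mixes independence-Metropolis proposals drawn from a surrogate "flow" density with occasional local updates on a bimodal toy target. The mixture-of-Gaussians stand-in for a trained flow, the random-walk step standing in for HMC, and the 4:1 update schedule are illustrative assumptions, not details of the paper.

import numpy as np

rng = np.random.default_rng(0)

# Toy bimodal target: two well-separated "vacua". Normalization constants are
# dropped throughout, since they cancel in the acceptance ratios below.
def log_p(x):
    return np.logaddexp(-0.5 * (x - 4.0) ** 2, -0.5 * (x + 4.0) ** 2)

# Stand-in for a trained flow: a tractable density we can both sample and evaluate.
def sample_q(n):
    centers = np.where(rng.integers(0, 2, size=n) == 0, -4.0, 4.0)
    return rng.normal(centers, 1.2)

def log_q(x):
    return np.logaddexp(-0.5 * ((x - 4.0) / 1.2) ** 2,
                        -0.5 * ((x + 4.0) / 1.2) ** 2)

x, chain = 4.0, []
for step in range(20000):
    if step % 5 != 0:
        # Flow proposal: independence Metropolis with weight p/q (can hop between modes).
        y = sample_q(1)[0]
        log_acc = (log_p(y) - log_q(y)) - (log_p(x) - log_q(x))
    else:
        # Occasional local update: a random-walk step standing in for an HMC update.
        y = x + 0.5 * rng.normal()
        log_acc = log_p(y) - log_p(x)
    if np.log(rng.random()) < log_acc:
        x = y
    chain.append(x)

chain = np.array(chain)
print("fraction of samples in the x > 0 mode:", np.mean(chain > 0))  # roughly 0.5

A purely local sampler started in one mode would essentially never visit the other; the flow proposals supply the mode-hopping moves while the local updates decorrelate samples within a mode.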


Machine Learning Neutrino-Nucleus Cross Sections

arXiv.org Artificial Intelligence

Neutrino-nucleus scattering cross sections are critical theoretical inputs for long-baseline neutrino oscillation experiments. However, robust modeling of these cross sections remains challenging. For a simple but physically motivated toy model of the DUNE experiment, we demonstrate that an accurate neural-network model of the cross section -- leveraging Standard Model symmetries -- can be learned from near-detector data. We then perform a neutrino oscillation analysis with simulated far-detector events, finding that the modeled cross section achieves results consistent with what could be obtained if the true cross section were known exactly. This proof-of-principle study highlights the potential of future neutrino near-detector datasets and data-driven cross-section models.
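
The sketch below is only a schematic stand-in for the workflow described above: a tiny NumPy-implemented network is fit to toy "near-detector"-style data generated from a made-up cross-section shape. The data, network size, training details, and the omission of any symmetry constraints are all illustrative assumptions.

import numpy as np

rng = np.random.default_rng(5)

# Toy "near-detector" data: event counts vs. energy, generated from an
# invented smooth cross-section shape (illustrative only).
E = np.linspace(0.5, 5.0, 200)
true_xsec = E * np.exp(-0.5 * (E - 2.5) ** 2)
counts = rng.poisson(1000 * true_xsec)

# One-hidden-layer network fit by full-batch gradient descent on a
# mean-squared-error loss (factor of 2 absorbed into the learning rate).
W1 = rng.normal(size=(16, 1)) * 0.5
b1 = np.zeros(16)
w2 = rng.normal(size=16) * 0.5
b2 = 0.0

x = E[:, None]
y = counts / 1000.0

def model(x):
    h = np.tanh(x @ W1.T + b1)
    return h @ w2 + b2, h

lr = 1e-2
for _ in range(20000):
    pred, h = model(x)
    err = pred - y
    grad_w2 = h.T @ err / len(y)
    grad_b2 = err.mean()
    dh = np.outer(err, w2) * (1 - h ** 2)   # backprop through tanh
    grad_W1 = dh.T @ x / len(y)
    grad_b1 = dh.mean(axis=0)
    w2 -= lr * grad_w2; b2 -= lr * grad_b2
    W1 -= lr * grad_W1; b1 -= lr * grad_b1

pred, _ = model(x)
print("max relative error of the fitted shape:",
      np.max(np.abs(pred - true_xsec)) / true_xsec.max())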


Practical applications of machine-learned flows on gauge fields

arXiv.org Artificial Intelligence

Numerical lattice quantum chromodynamics (QCD) is an integral part of the modern particle and nuclear theory toolkit [1-9]. In this framework, the discretized path integral is computed using Monte Carlo methods. Computationally, this is very expensive, and grows more so as physical limits of interest are approached [10-12]. Consequently, algorithmic developments are an important driver of progress. For example, resolving topological freezing [12-14]--an issue that arises in sampling gauge field configurations with state-of-the-art Markov chain Monte Carlo (MCMC) algorithms like heatbath [15-19] or Hybrid/Hamiltonian Monte Carlo (HMC) [20-22]--would provide access to finer lattice spacings than presently affordable. To such ends, recent work has explored how emerging machine learning (ML) techniques may be applied to lattice QCD [23, 24]. Of particular interest has been the possibility of accelerating gauge-field sampling [25-34] using normalizing flows [35-37], a class of generative statistical models with tractable density functions. In this framework, a flow is a learned, invertible (diffeomorphic) map between gauge fields. Abstractly, flows may be thought of as bridges between different distributions over gauge fields (or, equivalently, different theories or choices of action parameters).
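
The "tractable density" property mentioned above rests on the standard change-of-variables relation. In the notation below (which is ours, not the paper's), f is the flow mapping a base distribution r to the model distribution q over fields U, and S is the target action:

\[
  q(U) \;=\; r\!\left(f^{-1}(U)\right)\,
  \left|\det \frac{\partial f^{-1}(U)}{\partial U}\right| ,
  \qquad
  w(U) \;\propto\; \frac{e^{-S(U)}}{q(U)} .
\]

Sampling from r and pushing samples through f yields proposals whose density q can be evaluated exactly, and the weight w corrects any residual mismatch with the target distribution via reweighting or a Metropolis accept/reject step, which is what makes such samplers asymptotically exact.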


Applications of flow models to the generation of correlated lattice QCD ensembles

arXiv.org Artificial Intelligence

Machine-learned normalizing flows can be used in the context of lattice quantum field theory to generate statistically correlated ensembles of lattice gauge fields at different action parameters. This work demonstrates how these correlations can be exploited for variance reduction in the computation of observables. Three different proof-of-concept applications are demonstrated using a novel residual flow architecture: continuum limits of gauge theories, the mass dependence of QCD observables, and hadronic matrix elements based on the Feynman-Hellmann approach. In all three cases, it is shown that statistical uncertainties are significantly reduced when machine-learned flows are incorporated as compared with the same calculations performed with uncorrelated ensembles or direct reweighting.
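
The variance-reduction mechanism is the familiar one for correlated differences, Var(A - B) = Var(A) + Var(B) - 2 Cov(A, B). The NumPy toy below (not the paper's residual-flow setup) mimics it by comparing the spread of an estimated difference of "observables" computed on correlated versus independent samples.

import numpy as np

rng = np.random.default_rng(1)
n, trials = 1000, 1000

# Toy stand-in for an observable measured at two nearby action parameters.
# "Correlated" mimics ensembles related by a flow (shared fluctuations);
# "independent" mimics separately generated ensembles.
def estimate_difference(correlated):
    common = rng.normal(size=n)
    a = 1.00 + common + 0.3 * rng.normal(size=n)
    if correlated:
        b = 1.05 + common + 0.3 * rng.normal(size=n)
    else:
        b = 1.05 + rng.normal(size=n) + 0.3 * rng.normal(size=n)
    return a.mean() - b.mean()

for corr in (True, False):
    diffs = [estimate_difference(corr) for _ in range(trials)]
    print("correlated  " if corr else "independent ",
          "std of estimated difference:", np.std(diffs))

The correlated case shows a markedly smaller statistical error on the difference, which is the effect exploited for continuum limits, mass dependences, and Feynman-Hellmann matrix elements in the paper.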


Normalizing flows for lattice gauge theory in arbitrary space-time dimension

arXiv.org Artificial Intelligence

Applications of normalizing flows to the sampling of field configurations in lattice gauge theory have so far been explored almost exclusively in two space-time dimensions. We report new algorithmic developments of gauge-equivariant flow architectures facilitating the generalization to higher-dimensional lattice geometries. Specifically, we discuss masked autoregressive transformations with tractable and unbiased Jacobian determinants, a key ingredient for scalable and asymptotically exact flow-based sampling algorithms. For concreteness, results from a proof-of-principle application to SU(3) lattice gauge theory in four space-time dimensions are reported.
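
As a toy illustration of why masked autoregressive transformations have tractable and exact Jacobian determinants, the sketch below applies a masked affine map to a dense vector (standing in for field degrees of freedom) and checks the analytic log-determinant against a finite-difference Jacobian. The fixed masked linear conditioner here replaces the learned, gauge-equivariant networks of the paper.

import numpy as np

rng = np.random.default_rng(2)
d = 5

# Strictly-lower-triangular mask: output i may depend only on inputs j < i,
# so the transformation is autoregressive and its Jacobian is triangular.
mask = np.tril(np.ones((d, d)), k=-1)
W_s = rng.normal(size=(d, d)) * mask * 0.3
W_t = rng.normal(size=(d, d)) * mask * 0.3

def forward(x):
    s = W_s @ x          # log-scales, each depending only on earlier entries
    t = W_t @ x          # shifts, with the same masking
    y = x * np.exp(s) + t
    logdet = np.sum(s)   # triangular Jacobian: log|det J| = sum of log-scales
    return y, logdet

x = rng.normal(size=d)
y, logdet = forward(x)

# Numerical check against a finite-difference Jacobian.
eps = 1e-6
J = np.empty((d, d))
for j in range(d):
    e = np.zeros(d); e[j] = eps
    J[:, j] = (forward(x + e)[0] - forward(x - e)[0]) / (2 * eps)
print(logdet, np.linalg.slogdet(J)[1])  # the two values should agree closely

Because the determinant is exact rather than estimated, the resulting flow-based sampler remains asymptotically unbiased, which is the property emphasized above.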


Aspects of scaling and scalability for flow-based sampling of lattice QCD

arXiv.org Artificial Intelligence

Recent applications of machine-learned normalizing flows to sampling in lattice field theory suggest that such methods may be able to mitigate critical slowing down and topological freezing. However, these demonstrations have been at the scale of toy models, and it remains to be determined whether they can be applied to state-of-the-art lattice quantum chromodynamics calculations. Assessing the viability of sampling algorithms for lattice field theory at scale has traditionally been accomplished using simple cost scaling laws, but as we discuss in this work, their utility is limited for flow-based approaches. We conclude that flow-based approaches to sampling are better thought of as a broad family of algorithms with different scaling properties, and that scalability must be assessed experimentally.
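
For concreteness, a "simple cost scaling law" of the kind referred to above is typically a power-law fit to measured costs, as in the hypothetical NumPy example below; the volumes and costs are invented for illustration, and the paper's point is that this kind of extrapolation is unreliable for flow-based samplers.

import numpy as np

# Hypothetical measured sampling costs (arbitrary units) at several volumes.
V    = np.array([16**4, 24**4, 32**4, 48**4], dtype=float)
cost = np.array([1.0, 6.1, 21.0, 120.0])

# Fit the conventional power-law ansatz cost ~ c * V**alpha in log-log space.
alpha, log_c = np.polyfit(np.log(V), np.log(cost), 1)
print("fitted exponent alpha =", alpha)

# Extrapolation to a production-scale volume: exactly the step the paper
# argues is uninformative for flow-based samplers, whose cost depends on
# model capacity, training, and the algorithmic variant rather than on a
# single universal exponent.
print("extrapolated cost at 96^4:", np.exp(log_c) * (96.0**4) ** alpha)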


Gauge-equivariant flow models for sampling in lattice field theories with pseudofermions

arXiv.org Artificial Intelligence

Lattice quantum field theory (LQFT), particularly lattice quantum chromodynamics, has become a ubiquitous tool in high-energy and nuclear theory [1-4]. Given the extraordinary computational cost of state-of-the-art LQFT studies, advances in the form of more efficient algorithms are an important driver of progress. Specifically, computing the probability density after the fermionic integration via direct methods is not feasible for at-scale studies of theories such as QCD, as such methods scale cubically with the spacetime volume. The usual approach to this challenge is to introduce auxiliary degrees of freedom, named pseudofermions, which function as stochastic determinant estimators for which the cost of evaluation scales more favorably with the lattice volume.
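
The pseudofermion construction referred to in this excerpt is the standard Gaussian-integral representation of the fermion determinant (here M denotes the lattice Dirac operator and \phi the auxiliary pseudofermion field):

\[
  \det\!\left(M^\dagger M\right) \;\propto\;
  \int \mathcal{D}\phi^\dagger \, \mathcal{D}\phi \;
  \exp\!\left(-\,\phi^\dagger \left(M^\dagger M\right)^{-1} \phi\right) ,
\]

so the fermionic contribution to the action can be evaluated with linear solves against M^\dagger M rather than a direct determinant, which is what yields the more favorable volume scaling mentioned above.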


Neural-network preconditioners for solving the Dirac equation in lattice gauge theory

arXiv.org Artificial Intelligence

This work develops neural-network-based preconditioners to accelerate the solution of the Wilson-Dirac normal equation in lattice quantum field theories. The approach is implemented for the two-flavor lattice Schwinger model near the critical point. In this system, neural-network preconditioners are found to accelerate the convergence of the conjugate gradient solver compared with the solution of unpreconditioned systems or those preconditioned with conventional approaches based on even-odd or incomplete Cholesky decompositions, as measured by reductions in the number of iterations and/or complex operations required for convergence. It is also shown that a preconditioner trained on ensembles with small lattice volumes can be used to construct preconditioners for ensembles with many times larger lattice volumes, with minimal degradation of performance. This volume-transferring technique amortizes the training cost and presents a pathway towards scaling such preconditioners to lattice field theory calculations with larger lattice volumes and in four dimensions.
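
The sketch below shows the preconditioned conjugate gradient structure into which such a learned preconditioner would be slotted. Here a toy ill-conditioned positive-definite system stands in for the Wilson-Dirac normal equation, and a simple Jacobi preconditioner stands in for the neural network (the paper's baselines are even-odd and incomplete Cholesky, not Jacobi).

import numpy as np

rng = np.random.default_rng(3)
n = 200

# Toy Hermitian positive-definite system standing in for (M^dagger M) x = b;
# the row/column scaling makes it deliberately ill-conditioned so that
# preconditioning matters.
G = rng.normal(size=(n, n))
scales = 10.0 ** rng.uniform(-1.0, 1.0, size=n)
A = np.diag(scales) @ (G @ G.T + n * np.eye(n)) @ np.diag(scales)
b = rng.normal(size=n)

def pcg(A, b, precond, tol=1e-8, max_iter=5000):
    """Preconditioned conjugate gradient; precond(r) should approximate A^{-1} r."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = precond(r)
    p = z.copy()
    for k in range(1, max_iter + 1):
        Ap = A @ p
        alpha = (r @ z) / (p @ Ap)
        x = x + alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < tol * np.linalg.norm(b):
            return x, k
        z_new = precond(r_new)
        beta = (r_new @ z_new) / (r @ z)
        p = z_new + beta * p
        r, z = r_new, z_new
    return x, max_iter

preconditioners = {
    "none             ": lambda r: r,                # unpreconditioned baseline
    "jacobi (stand-in)": lambda r: r / np.diag(A),   # cheap stand-in for the learned model
}
for name, P in preconditioners.items():
    _, iters = pcg(A, b, P)
    print(name, "converged in", iters, "iterations")

In the paper the preconditioner is a trained network rather than a diagonal rescaling, but the figure of merit is the same: the reduction in iterations (and operations) needed for the solver to converge.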


Sampling using SU(N) gauge equivariant flows

arXiv.org Machine Learning

Gauge theories based on SU(N) or U(N) groups describe many aspects of nature. For example, the Standard Model of nuclear and particle physics is a nonabelian gauge theory with the symmetry group U(1)×SU(2)×SU(3), candidate theories for physics beyond the Standard Model can be defined based on strongly interacting SU(N) gauge theories [1, 2], SU(N) gauge symmetries emerge in various condensed matter systems [3-7], and SU(N) and U(N) gauge symmetries feature in the low energy limit of certain string-theory vacua [8]. In the context of the rapidly-developing area of machine learning [...]

In Ref. [11], this approach was demonstrated in the context of U(1) gauge theory. Here, we develop a class of kernels for SU(N) group elements (and describe a similar construction for U(N) group elements). We show that if an invertible transformation acts only on the eigenvalues of a matrix and is equivariant under permutation of those eigenvalues, then it is equivariant under matrix conjugation and may be used as a kernel. Moreover, by making a connection to the maximal torus within the group and to the Weyl group of the root system, we show that this is in fact a universal way to define a kernel for unitary groups.
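
The NumPy sketch below is a numerical illustration (not the paper's kernel construction) of the stated property: a transformation that acts only on the eigenvalues of a matrix and is equivariant under their permutation is equivariant under matrix conjugation. The helper names and the specific phase map are ours, and the toy map does not enforce the unit-determinant constraint that the actual SU(N) kernels handle via the maximal-torus construction.

import numpy as np

rng = np.random.default_rng(4)

def random_su3():
    """Roughly Haar-random SU(3): QR of a complex Gaussian, phases fixed, det set to 1."""
    z = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
    q, r = np.linalg.qr(z)
    q = q @ np.diag(np.diag(r) / np.abs(np.diag(r)))
    return q / np.linalg.det(q) ** (1.0 / 3.0)

def spectral_map(u, eps=0.3):
    """Apply a permutation-equivariant map to the eigenvalue phases of u."""
    lam, v = np.linalg.eig(u)        # u is normal, so v is (numerically) unitary
    theta = np.angle(lam)
    # Permutation-equivariant update: each phase is shifted by a function of
    # itself and a symmetric (order-independent) combination of all phases.
    theta_new = theta + eps * np.sin(theta - np.mean(theta))
    return v @ np.diag(np.exp(1j * theta_new)) @ v.conj().T

u = random_su3()
x = random_su3()

lhs = spectral_map(x @ u @ x.conj().T)   # conjugate, then map
rhs = x @ spectral_map(u) @ x.conj().T   # map, then conjugate
print("conjugation equivariance violation:", np.max(np.abs(lhs - rhs)))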