Ab-initio variational wave functions for the time-dependent many-electron Schr\"odinger equation
Nys, Jannes, Pescia, Gabriel, Carleo, Giuseppe
Describing the dynamics of many-electron quantum systems is crucial for applications such as predicting electronic structures in quantum chemistry, the properties of condensed matter systems, and the behaviors of complex materials. However, the real-time evolution of non-equilibrium quantum electronic systems poses a significant challenge for theoretical and computational approaches, due to the system's exploration of a vast configuration space. This work introduces a variational approach for fermionic time-dependent wave functions, surpassing mean-field approximations by capturing many-body correlations. The proposed methodology involves parameterizing the time-evolving quantum state, enabling the approximation of the state's evolution. To account for electron correlations, we employ time-dependent Jastrow factors and backflow transformations. We also show that we can incorporate neural networks to parameterize these functions. The time-dependent variational Monte Carlo technique is employed to efficiently compute the optimal time-dependent parameters. The approach is demonstrated in three distinct systems: the solvable harmonic interaction model, the dynamics of a diatomic molecule in intense laser fields, and a quenched quantum dot. In all cases, we show clear signatures of many-body correlations in the dynamics not captured by mean-field methods. The results showcase the ability of our variational approach to accurately capture the time evolution of quantum states, providing insight into the quantum dynamics of interacting electronic systems beyond the capabilities of mean-field methods.
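The working equations of time-dependent variational Monte Carlo can be sketched in a few lines. The following toy uses a one-parameter Gaussian ansatz psi_theta(x) = exp(-theta * x^2) for a single particle in a harmonic trap as a stand-in for the Jastrow-backflow parametrizations of the paper; the ansatz, the Hamiltonian, and the Euler integrator are illustrative assumptions, not the paper's setup:

```python
import numpy as np

def tvmc_step(theta, samples, dt):
    """One Euler step of time-dependent variational Monte Carlo (t-VMC)
    for the toy ansatz psi_theta(x) = exp(-theta * x^2) evolving under
    the 1D harmonic oscillator H = -0.5 d^2/dx^2 + 0.5 x^2."""
    x = samples
    # Log-derivative of the ansatz with respect to the parameter.
    O = -x**2
    # Local energy E_loc = (H psi)/psi for this ansatz.
    e_loc = theta - 2.0 * theta**2 * x**2 + 0.5 * x**2
    # Covariances defining the equation of motion (scalars here, since
    # there is a single parameter).
    S = np.mean(np.abs(O)**2) - np.abs(np.mean(O))**2
    F = np.mean(np.conj(O) * e_loc) - np.conj(np.mean(O)) * np.mean(e_loc)
    # t-VMC equation of motion: S * dtheta/dt = -1j * F.
    return theta + dt * (-1j * F / S)
```

A quick sanity check: at theta = 1/2 the ansatz is the harmonic-oscillator ground state, the local energy is constant, the variational force F vanishes, and the parameter stays put under time evolution.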
Learning ground states of gapped quantum Hamiltonians with Kernel Methods
Giuliani, Clemens, Vicentini, Filippo, Rossi, Riccardo, Carleo, Giuseppe
Neural network approaches to approximate the ground state of quantum Hamiltonians require the numerical solution of a highly nonlinear optimization problem. We introduce a statistical learning approach that makes the optimization trivial by using kernel methods. Our scheme is an approximate realization of the power method, where supervised learning is used to learn the next step of the power iteration. We show that the ground state properties of arbitrary gapped quantum Hamiltonians can be reached with polynomial resources under the assumption that the supervised learning is efficient. Using kernel ridge regression, we provide numerical evidence that the learning assumption is verified by applying our scheme to find the ground states of several prototypical interacting many-body quantum systems, both in one and two dimensions, showing the flexibility of our approach.
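The core idea, learning each power-method step by supervised regression, can be sketched on a toy problem. Everything below (the RBF kernel, the 8x8 matrix with spectrum 0..7, the spin-string features) is an illustrative assumption, not the paper's setup:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    # Gaussian (RBF) kernel between rows of X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_power_step(H, shift, X, psi, ridge=1e-10):
    """One shifted power-method step psi <- (shift*I - H) psi, where the
    new amplitudes are *learned* by kernel ridge regression on the
    configurations X instead of being applied exactly."""
    target = (shift * np.eye(len(H)) - H) @ psi
    K = rbf_kernel(X, X)
    alpha = np.linalg.solve(K + ridge * np.eye(len(K)), target)
    psi_new = K @ alpha                  # evaluate the learned amplitudes
    return psi_new / np.linalg.norm(psi_new)

# Toy gapped "Hamiltonian" with known spectrum 0, 1, ..., 7.
rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((8, 8)))
H = Q @ np.diag(np.arange(8.0)) @ Q.T
# Basis configurations encoded as 3 spins in {-1, +1}.
X = np.array([[2 * ((i >> b) & 1) - 1 for b in range(3)] for i in range(8)], float)
psi = np.ones(8) / np.sqrt(8)
for _ in range(300):
    psi = kernel_power_step(H, shift=7.5, X=X, psi=psi)
energy = psi @ H @ psi                   # converges to the ground-state energy
```

With all configurations used as training data and a tiny ridge, the regression is essentially exact and the iteration reduces to the ordinary shifted power method; the interesting regime of the paper is when only a sampled subset of configurations is available.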
Hybrid Ground-State Quantum Algorithms based on Neural Schr\"odinger Forging
de Schoulepnikoff, Paulin, Kiss, Oriel, Vallecorsa, Sofia, Carleo, Giuseppe, Grossi, Michele
Entanglement forging based variational algorithms leverage the bi-partition of quantum systems for addressing ground state problems. The primary limitation of these approaches lies in the exponential summation required over the numerous potential basis states, or bitstrings, when performing the Schmidt decomposition of the whole system. To overcome this challenge, we propose a new method for entanglement forging employing generative neural networks to identify the most pertinent bitstrings, eliminating the need for the exponential sum. Through empirical demonstrations on systems of increasing complexity, we show that the proposed algorithm achieves comparable or superior performance compared to the existing standard implementation of entanglement forging. Moreover, by controlling the amount of required resources, this scheme can be applied to larger, as well as non-permutation-invariant systems, where the latter constraint is associated with the Heisenberg forging procedure. We substantiate our findings through numerical simulations conducted on spin models with one-dimensional ring and two-dimensional triangular-lattice topologies, as well as nuclear shell-model configurations.
Empirical Sample Complexity of Neural Network Mixed State Reconstruction
Zhao, Haimeng, Carleo, Giuseppe, Vicentini, Filippo
Quantum state reconstruction using Neural Quantum States has been proposed as a viable tool to reduce quantum shot complexity in practical applications, and its advantage over competing techniques has been shown in numerical experiments focusing mainly on the noiseless case. In this work, we numerically investigate the performance of different quantum state reconstruction techniques for mixed states, using the finite-temperature Ising model as a benchmark. We show how to systematically reduce the quantum resource requirement of the algorithms by applying variance reduction techniques. Then, we compare the two leading neural quantum state encodings of the state, namely, the Neural Density Operator and the positive operator-valued measurement representation, and illustrate their different performance as the mixedness of the target state varies. We find that certain encodings are more efficient in different regimes of mixedness and point out the need for designing more efficient encodings in terms of both classical and quantum resources.
From Tensor Network Quantum States to Tensorial Recurrent Neural Networks
Wu, Dian, Rossi, Riccardo, Vicentini, Filippo, Carleo, Giuseppe
Tensor networks (TN) have been extensively used to represent the states of quantum many-body physical systems [1-3]. Matrix product states (MPS) are possibly the simplest family of TN, and are suitable to capture the ground state of 1D gapped Hamiltonians [4, 5]. They can be contracted in polynomial time to compute physical quantities exactly, and optimized by the density matrix renormalization group (DMRG) [6] when used as variational ansätze. More powerful TN architectures that cannot be efficiently contracted in general have been proposed later, notably projected entangled pair states (PEPS). Considering the relation between neural networks (NN) and TN, the first works focused on restricted Boltzmann machines (RBM), which are one of the simplest classes of NN. It is impossible to efficiently map an RBM onto a TN, as they correspond to string-bond states with an arbitrary nonlocal geometry [28]. This result was later refined to show that an RBM may correspond to an MPS with an exponentially large bond dimension, and only short-range RBM can be mapped onto efficiently computable entangled plaquette states [31]. Similar results have been obtained showing that deep Boltzmann machines with proper constraints can be mapped onto TN that are efficiently computable through transfer matrix methods [32].
Ab-initio quantum chemistry with neural-network wavefunctions
Hermann, Jan, Spencer, James, Choo, Kenny, Mezzacapo, Antonio, Foulkes, W. M. C., Pfau, David, Carleo, Giuseppe, Noé, Frank
Machine learning and specifically deep-learning methods have outperformed human capabilities in many pattern recognition and data processing problems, in game playing, and now also play an increasingly important role in scientific discovery. A key application of machine learning in the molecular sciences is to learn potential energy surfaces or force fields from ab-initio solutions of the electronic Schr\"odinger equation using datasets obtained with density functional theory, coupled cluster, or other quantum chemistry methods. Here we review a recent and complementary approach: using machine learning to aid the direct solution of quantum chemistry problems from first principles. Specifically, we focus on quantum Monte Carlo (QMC) methods that use neural network ansatz functions in order to solve the electronic Schr\"odinger equation, both in first and second quantization, computing ground and excited states, and generalizing over multiple nuclear configurations. Compared to existing quantum chemistry methods, these new deep QMC methods have the potential to generate highly accurate solutions of the Schr\"odinger equation at relatively modest computational cost.
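As a concrete, deliberately miniature illustration of the QMC ingredient, the sketch below runs variational Monte Carlo for the hydrogen atom with a one-parameter trial function exp(-alpha*|r|); in the deep QMC methods reviewed here, that exponent would instead be the output of a neural network:

```python
import numpy as np

def vmc_energy(alpha, n_steps=2000, step=0.5, seed=0):
    """Variational Monte Carlo energy (in Hartree) of the hydrogen atom
    with the trial wavefunction psi(r) = exp(-alpha * |r|)."""
    rng = np.random.default_rng(seed)
    r = np.array([1.0, 0.0, 0.0])
    energies = []
    for _ in range(n_steps):
        r_new = r + step * rng.standard_normal(3)
        # Metropolis acceptance with probability |psi_new / psi_old|^2.
        if rng.random() < np.exp(-2 * alpha * (np.linalg.norm(r_new) - np.linalg.norm(r))):
            r = r_new
        d = np.linalg.norm(r)
        # Local energy E_loc = (H psi)/psi for this ansatz.
        energies.append(-0.5 * alpha**2 + (alpha - 1.0) / d)
    return np.mean(energies), np.var(energies)
```

At alpha = 1 the trial function is the exact ground state, the local energy is the constant -1/2 Hartree, and the variance vanishes, the zero-variance property that VMC optimizations of neural ansatz functions also exploit.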
Neural tensor contractions and the expressive power of deep neural quantum states
Sharir, Or, Shashua, Amnon, Carleo, Giuseppe
We establish a direct connection between general tensor networks and deep feed-forward artificial neural networks. The core of our results is the construction of neural-network layers that efficiently perform tensor contractions, and that use commonly adopted non-linear activation functions. The resulting deep networks feature a number of edges that closely matches the contraction complexity of the tensor networks to be approximated. In the context of many-body quantum states, this result establishes that neural-network states have strictly the same or higher expressive power than practically usable variational tensor networks. As an example, we show that all matrix product states can be efficiently written as neural-network states with a number of edges polynomial in the bond dimension and depth logarithmic in the system size. The opposite instead does not hold true, and our results imply that there exist quantum states that are not efficiently expressible in terms of matrix product states or PEPS, but that are instead efficiently expressible with neural network states.
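The contraction in question is, for an MPS, just a chain of matrix products. A sketch (leaving the neural-network emulation itself aside) showing the polynomial cost in the bond dimension:

```python
import numpy as np

def mps_amplitude(tensors, config):
    """Amplitude <s1...sN|psi> of an open-boundary MPS, computed as a
    left-to-right chain of matrix products -- the contraction that the
    paper shows can be emulated by neural-network layers with a number
    of edges polynomial in the bond dimension."""
    v = np.ones(1)
    for A, s in zip(tensors, config):   # A has shape (phys, D_left, D_right)
        v = v @ A[s]
    return v.item()

# A random 4-site MPS with bond dimension 3 and open boundaries.
rng = np.random.default_rng(3)
shapes = [(2, 1, 3), (2, 3, 3), (2, 3, 3), (2, 3, 1)]
mps = [0.5 * rng.standard_normal(s) for s in shapes]
# Cross-check one amplitude against the full einsum contraction.
full = np.einsum('aij,bjk,ckl,dlm->abcd', *mps)
amp = mps_amplitude(mps, (0, 1, 1, 0))
```

Each step costs O(D^2), so the whole amplitude costs O(N D^2), which is the scaling the equivalent neural-network representation must match.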
Unbiased Monte Carlo Cluster Updates with Autoregressive Neural Networks
Wu, Dian, Rossi, Riccardo, Carleo, Giuseppe
Efficient sampling of complex high-dimensional probability densities is a central task in computational science. Machine Learning techniques based on autoregressive neural networks have been recently shown to provide good approximations of probability distributions of interest in physics. In this work, we propose a systematic way to remove the intrinsic bias associated with these variational approximations, combining it with Markov-chain Monte Carlo in an automatic scheme to efficiently generate cluster updates, which is particularly useful for models for which no efficient cluster update scheme is known. Our approach is based on symmetry-enforced cluster updates building on the neural-network representation of conditional probabilities. We demonstrate that such finite-cluster updates are crucial to circumvent ergodicity problems associated with global neural updates. We test our method for first- and second-order phase transitions in classical spin systems, proving in particular its viability for critical systems, or in the presence of metastable states.
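The bias-removal mechanism can be sketched in its simplest special case: full-configuration proposals drawn from an autoregressive model, corrected by a Metropolis-Hastings test (the paper's symmetry-enforced *cluster* updates are a refinement of this). The two-bit target and the mismatched proposal model below are toy assumptions:

```python
import numpy as np

def autoregressive_sample(p_first, p_second, rng):
    """Sample a 2-bit configuration from an autoregressive model
    q(s1, s2) = q(s1) * q(s2 | s1) and return it with its probability."""
    s1 = int(rng.random() < p_first)
    p1 = p_first if s1 else 1 - p_first
    p2_cond = p_second[s1]
    s2 = int(rng.random() < p2_cond)
    p2 = p2_cond if s2 else 1 - p2_cond
    return (s1, s2), p1 * p2

def neural_mcmc(target, p_first, p_second, n, seed=0):
    """Metropolized independent sampling: proposals come from the
    autoregressive model q, and the Metropolis-Hastings test removes the
    variational bias, so the chain samples the exact target."""
    rng = np.random.default_rng(seed)
    x, qx = autoregressive_sample(p_first, p_second, rng)
    out = []
    for _ in range(n):
        y, qy = autoregressive_sample(p_first, p_second, rng)
        # Acceptance ratio for an independence sampler.
        if rng.random() < min(1.0, target[y] * qx / (target[x] * qy)):
            x, qx = y, qy
        out.append(x)
    return out
```

Even with a deliberately wrong (uniform) proposal model, the accept/reject step makes the empirical frequencies converge to the target distribution; the paper's contribution is keeping the acceptance rate high by proposing cluster updates rather than full configurations.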
Quantum Natural Gradient
Stokes, James, Izaac, Josh, Killoran, Nathan, Carleo, Giuseppe
Variational optimization of parametrized quantum circuits is an integral component for many hybrid quantum-classical algorithms, which are arguably the most promising applications of Noisy Intermediate-Scale Quantum (NISQ) computers [1]. Applications include the Variational Quantum Eigensolver (VQE) [2], Quantum Approximate Optimization Algorithm (QAOA) [3] and Quantum Neural Networks (QNNs) [4-6]. All the above are examples of stochastic optimization problems whereby one minimizes the expected value of a random cost function over a set of variational parameters, using noisy estimates of the cost and/or its gradient. In the quantum setting these estimates are obtained by repeated measurements of some Hermitian observables for a quantum state which depends on the variational parameters. A variety of optimization methods have been proposed in the variational quantum circuit literature for determining optimal variational parameters, including derivative-free (zeroth-order) methods such as Nelder-Mead, finite-differencing [7] or SPSA [8].
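A minimal numerical sketch of the natural-gradient update itself, using a hand-coded two-parameter single-qubit circuit (an illustrative toy, not a NISQ device and not the paper's block-diagonal approximation of the metric):

```python
import numpy as np

def state(theta):
    # |psi(theta)> = RZ(theta[1]) RY(theta[0]) |0>.
    t0, t1 = theta
    psi = np.array([np.cos(t0 / 2), np.sin(t0 / 2)], complex)
    return np.array([np.exp(-1j * t1 / 2), np.exp(1j * t1 / 2)]) * psi

def energy(theta):
    # Cost: <psi| X |psi> = sin(t0) * cos(t1), minimized at -1.
    X = np.array([[0, 1], [1, 0]], complex)
    psi = state(theta)
    return np.vdot(psi, X @ psi).real

def metric(theta, eps=1e-6):
    """Fubini-Study metric g_ij = Re[<d_i psi|d_j psi>
    - <d_i psi|psi><psi|d_j psi>], by finite differences."""
    psi = state(theta)
    d = []
    for k in range(2):
        tp = theta.copy(); tp[k] += eps
        d.append((state(tp) - psi) / eps)
    g = np.empty((2, 2))
    for i in range(2):
        for j in range(2):
            g[i, j] = (np.vdot(d[i], d[j])
                       - np.vdot(d[i], psi) * np.vdot(psi, d[j])).real
    return g

def grad(theta, eps=1e-6):
    g = np.empty(2)
    for k in range(2):
        tp = theta.copy(); tp[k] += eps
        g[k] = (energy(tp) - energy(theta)) / eps
    return g

# Quantum natural gradient: theta <- theta - eta * g(theta)^{-1} grad(theta).
theta = np.array([2.0, 2.5])
for _ in range(500):
    theta -= 0.1 * np.linalg.solve(metric(theta) + 1e-6 * np.eye(2), grad(theta))
```

Preconditioning by the (regularized) Fubini-Study metric rescales each parameter direction by its effect on the quantum state, which is why the same learning rate works well even where the plain gradient is poorly scaled.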
Learning hard quantum distributions with variational autoencoders
Rocchetto, Andrea, Grant, Edward, Strelchuk, Sergii, Carleo, Giuseppe, Severini, Simone
Studying general quantum many-body systems is one of the major challenges in modern physics because it requires an amount of computational resources that scales exponentially with the size of the system. Simulating the evolution of a state, or even storing its description, rapidly becomes intractable for exact classical algorithms. Recently, machine learning techniques, in the form of restricted Boltzmann machines, have been proposed as a way to efficiently represent certain quantum states with applications in state tomography and ground state estimation. Here, we introduce a new representation of states based on variational autoencoders. Variational autoencoders are a type of generative model in the form of a neural network. We probe the power of this representation by encoding probability distributions associated with states from different classes. Our simulations show that deep networks give a better representation for states that are hard to sample from, while providing no benefit for random states. This suggests that the probability distributions associated with hard quantum states might have a compositional structure that can be exploited by layered neural networks. Specifically, we consider the learnability of a class of quantum states introduced by Fefferman and Umans. Such states are provably hard to sample for classical computers, but not for quantum ones, under plausible computational complexity assumptions. The good level of compression achieved for hard states suggests these methods can be suitable for characterising states of the size expected in first generation quantum hardware.
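A minimal sketch of the quantity such a model trains on: the evidence lower bound (ELBO) for a VAE over n-qubit measurement bitstrings. The one-linear-layer encoder and decoder and the layer sizes are hypothetical simplifications, not the architecture of the paper:

```python
import numpy as np

def elbo(x, enc_W, enc_b, dec_W, dec_b, rng):
    """Single-sample ELBO of a minimal VAE over bitstrings x. Encoder and
    decoder are one linear layer each; the latent prior is a standard
    normal; the decoder outputs independent Bernoulli probabilities."""
    h = x @ enc_W + enc_b                       # encoder: [mu, log_var]
    mu, log_var = np.split(h, 2)
    # Reparameterization trick: z = mu + sigma * epsilon.
    z = mu + np.exp(0.5 * log_var) * rng.standard_normal(mu.shape)
    logits = z @ dec_W + dec_b
    p = 1.0 / (1.0 + np.exp(-logits))           # Bernoulli prob per bit
    log_px = np.sum(x * np.log(p) + (1 - x) * np.log(1 - p))
    kl = 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var)
    return log_px - kl                          # reconstruction - KL

rng = np.random.default_rng(4)
n_vis, n_lat = 4, 2
enc_W = 0.1 * rng.standard_normal((n_vis, 2 * n_lat))
enc_b = np.zeros(2 * n_lat)
dec_W = 0.1 * rng.standard_normal((n_lat, n_vis))
dec_b = np.zeros(n_vis)
x = np.array([1.0, 0.0, 1.0, 1.0])              # one measured bitstring
bound = elbo(x, enc_W, enc_b, dec_W, dec_b, rng)
```

Maximizing this bound over the network weights compresses the measurement distribution into the latent space, and the achievable compression is what the paper uses to probe how "hard" a quantum state's distribution is.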