Rocchetto, Andrea
Fast quantum learning with statistical guarantees
Ciliberto, Carlo, Rocchetto, Andrea, Rudi, Alessandro, Wossnig, Leonard
A wide class of quantum algorithms for learning problems exploits fast quantum linear algebra subroutines to achieve runtimes that are exponentially faster than their classical counterparts [Cil18]. Examples of these algorithms are quantum support vector machines [RML14], quantum linear regression [WBL12; SSP16], and quantum least squares [KP17; CGJ18]. A careful analysis of these algorithms identified a number of caveats that limit their practical applicability, such as the need for a strong form of quantum access to the input data, restrictions on structural properties of the data matrix (such as condition number or sparsity), and modes of access to the output [Aar15]. Furthermore, if one assumes that it is efficient to (classically) sample elements of the training data in a way proportional to their norm, then it is possible to show that classical algorithms are only polynomially slower (albeit the scaling of the quantum algorithms can be considerably better) [Tan18; CLW18; Chi19a; GLT18; Chi19b]. In this work we continue to investigate the limitations of quantum algorithms for learning problems.
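The classical access model assumed by the dequantization results above can be illustrated concretely: sampling rows of the data matrix with probability proportional to their squared norm. The sketch below is a minimal NumPy illustration of this sampling primitive (the matrix, sample count, and variable names are illustrative assumptions, not the cited algorithms themselves):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data matrix standing in for the training data.
X = rng.standard_normal((8, 3))

# Length-squared (l2-norm) sampling: pick row i with probability
# proportional to ||X_i||^2 -- the classical input model under which
# the dequantized algorithms are only polynomially slower.
row_norms_sq = np.sum(X**2, axis=1)
probs = row_norms_sq / row_norms_sq.sum()

samples = rng.choice(len(X), size=10_000, p=probs)

# Empirical frequencies track the norm-squared distribution.
freqs = np.bincount(samples, minlength=len(X)) / len(samples)
```

In the dequantization literature this kind of sample-and-query access plays the role that quantum state preparation plays for the quantum algorithms.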
Approximating Hamiltonian dynamics with the Nystr\"om method
Rudi, Alessandro, Wossnig, Leonard, Ciliberto, Carlo, Rocchetto, Andrea, Pontil, Massimiliano, Severini, Simone
Simulating the time evolution of quantum mechanical systems is BQP-hard and expected to be one of the foremost applications of quantum computers. We consider the approximation of Hamiltonian dynamics using subsampling methods from randomized numerical linear algebra. We propose conditions for the efficient approximation of state vectors evolving under a given Hamiltonian. As an immediate application, we show that sample-based quantum simulation, a type of evolution where the Hamiltonian is a density matrix, can be efficiently classically simulated under specific structural conditions. Our main technical contribution is a randomized algorithm for approximating Hermitian matrix exponentials. The proof leverages the Nystr\"om method to obtain low-rank approximations of the Hamiltonian. We envisage that techniques from randomized linear algebra will bring further insights into the power of quantum computation.
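The Nystr\"om step at the heart of the construction can be sketched in a few lines: sample a subset of columns of a positive semidefinite Hermitian matrix H, and assemble the approximation H ~= C W^+ C^dagger from the sampled columns C and their intersection block W. The NumPy sketch below is a minimal illustration under simplifying assumptions (exactly low-rank PSD input, uniform column sampling), not the paper's full randomized algorithm for matrix exponentials:

```python
import numpy as np

rng = np.random.default_rng(0)

# A Hermitian PSD matrix of exact low rank, so the Nystrom
# approximation can capture it from a small column sample.
n, rank = 100, 5
A = rng.standard_normal((n, rank)) + 1j * rng.standard_normal((n, rank))
H = A @ A.conj().T  # Hermitian, positive semidefinite, rank 5

def nystrom(H, idx):
    """Nystrom approximation H ~= C W^+ C^dagger from column subset idx."""
    C = H[:, idx]                 # sampled columns, shape (n, m)
    W = H[np.ix_(idx, idx)]       # intersection block, shape (m, m)
    return C @ np.linalg.pinv(W) @ C.conj().T

# Sample m = 20 columns uniformly at random.
idx = rng.choice(n, size=20, replace=False)
H_approx = nystrom(H, idx)

rel_err = np.linalg.norm(H - H_approx) / np.linalg.norm(H)
```

When the sampled columns span the range of H (generic for an exact rank-5 matrix and 20 random columns), the approximation is exact up to floating-point error; the low-rank factor can then be exponentiated cheaply via a small eigendecomposition.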
Quantum machine learning: a classical perspective
Ciliberto, Carlo, Herbster, Mark, Ialongo, Alessandro Davide, Pontil, Massimiliano, Rocchetto, Andrea, Severini, Simone, Wossnig, Leonard
Recently, increased computational power and data availability, as well as algorithmic advances, have led machine learning techniques to impressive results in regression, classification, data-generation and reinforcement learning tasks. Despite these successes, the proximity to the physical limits of chip fabrication alongside the increasing size of datasets is motivating a growing number of researchers to explore the possibility of harnessing the power of quantum computation to speed up classical machine learning algorithms. Here we review the literature in quantum machine learning and discuss perspectives for a mixed readership of classical machine learning and quantum computation experts. Particular emphasis will be placed on clarifying the limitations of quantum algorithms, how they compare with their best classical counterparts and why quantum resources are expected to provide advantages for learning problems. Learning in the presence of noise and certain computationally hard problems in machine learning are identified as promising directions for the field. Practical questions, like how to upload classical data into quantum form, will also be addressed.
Learning hard quantum distributions with variational autoencoders
Rocchetto, Andrea, Grant, Edward, Strelchuk, Sergii, Carleo, Giuseppe, Severini, Simone
Studying general quantum many-body systems is one of the major challenges in modern physics because it requires an amount of computational resources that scales exponentially with the size of the system. Simulating the evolution of a state, or even storing its description, rapidly becomes intractable for exact classical algorithms. Recently, machine learning techniques, in the form of restricted Boltzmann machines, have been proposed as a way to efficiently represent certain quantum states with applications in state tomography and ground state estimation. Here, we introduce a new representation of states based on variational autoencoders. Variational autoencoders are a type of generative model in the form of a neural network. We probe the power of this representation by encoding probability distributions associated with states from different classes. Our simulations show that deep networks give a better representation for states that are hard to sample from, while providing no benefit for random states. This suggests that the probability distributions associated with hard quantum states might have a compositional structure that can be exploited by layered neural networks. Specifically, we consider the learnability of a class of quantum states introduced by Fefferman and Umans. Such states are provably hard to sample for classical computers, but not for quantum ones, under plausible computational complexity assumptions. The good level of compression achieved for hard states suggests these methods can be suitable for characterising states of the size expected in first-generation quantum hardware.