Biamonte, Jacob, Wittek, Peter, Pancotti, Nicola, Rebentrost, Patrick, Wiebe, Nathan, Lloyd, Seth

Recent progress implies that a crossover between machine learning and quantum information processing benefits both fields. Traditional machine learning has dramatically improved the benchmarking and control of experimental quantum computing systems, including adaptive quantum phase estimation and the design of quantum gates. Conversely, quantum mechanics offers tantalizing prospects for enhancing machine learning, ranging from reduced computational complexity to improved generalization performance. The most notable examples include quantum-enhanced algorithms for principal component analysis, quantum support vector machines, and quantum Boltzmann machines. Progress has been rapid, fostered by demonstrations of midsized quantum optimizers that are predicted to soon outperform their classical counterparts. Further, we are witnessing the emergence of a physical theory pinpointing the fundamental and natural limitations of learning. Here we survey the cutting edge of this merger and list several open problems.

In this talk I will discuss some of the long-term challenges that emerge in the effort to make deep learning a relevant tool for controlled scientific discovery in many-body quantum physics. The current state of the art of deep neural quantum states and learning tools will be discussed in connection with challenging open problems in condensed matter physics, including frustrated magnetism and quantum dynamics.

Variational algorithms for gate-based quantum computers, such as the QAOA, prescribe a circuit ansatz that is fixed up to a set of continuous parameters and designed to find a low-energy state of a given target Hamiltonian. After reviewing the relevant aspects of the QAOA, I will describe attempts to make the algorithm more efficient. The strategies I will explore are 1) tuning the variational objective function away from the energy expectation value, 2) analytical estimates that allow elimination of some of the gates in the QAOA circuit, and 3) using methods of machine learning to search the design space of nearby circuits for improvements to the original ansatz.
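The fixed-ansatz structure described above can be made concrete with a minimal simulation. The following sketch (an illustration under stated conventions, not the speaker's code) runs p=1 QAOA for MaxCut on a single edge of two qubits and scans the two continuous parameters (gamma, beta), recovering the optimal cut value of 1.

```python
import cmath
import math

# Cost function: C(z) = 1 if the two bits of z differ (the edge is cut), else 0.
def cut(z):
    return (z & 1) ^ ((z >> 1) & 1)

def expectation(gamma, beta):
    # Start in the uniform superposition over the 4 basis states.
    amps = [0.5] * 4
    # Phase separator exp(-i * gamma * C), diagonal in the computational basis.
    amps = [a * cmath.exp(-1j * gamma * cut(z)) for z, a in enumerate(amps)]
    # Mixer exp(-i * beta * X) applied to each qubit separately.
    c, s = math.cos(beta), math.sin(beta)
    for qubit in (0, 1):
        bit = 1 << qubit
        new = list(amps)
        for z in range(4):
            if z & bit == 0:
                u, v = amps[z], amps[z | bit]
                new[z] = c * u - 1j * s * v
                new[z | bit] = -1j * s * u + c * v
        amps = new
    # Expected cut value under the Born distribution |amp|^2.
    return sum(abs(a) ** 2 * cut(z) for z, a in enumerate(amps))

# Coarse grid search over the two continuous parameters of the ansatz.
best = max(
    (expectation(k * math.pi / 8, j * math.pi / 16), k, j)
    for k in range(-8, 9)
    for j in range(-8, 9)
)
```

For this two-qubit instance the p=1 ansatz is exactly optimal (e.g. at gamma = pi/2, beta = pi/8 in this convention); on larger graphs the landscape is no longer solvable by inspection, which is what motivates the efficiency strategies listed above.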

Rocchetto, Andrea, Grant, Edward, Strelchuk, Sergii, Carleo, Giuseppe, Severini, Simone

Studying general quantum many-body systems is one of the major challenges in modern physics because it requires an amount of computational resources that scales exponentially with the size of the system. Simulating the evolution of a state, or even storing its description, rapidly becomes intractable for exact classical algorithms. Recently, machine learning techniques, in the form of restricted Boltzmann machines, have been proposed as a way to efficiently represent certain quantum states, with applications in state tomography and ground-state estimation. Here, we introduce a new representation of states based on variational autoencoders, a type of generative model in the form of a neural network. We probe the power of this representation by encoding probability distributions associated with states from different classes. Our simulations show that deep networks give a better representation of states that are hard to sample from, while providing no benefit for random states. This suggests that the probability distributions associated with hard quantum states might have a compositional structure that can be exploited by layered neural networks. Specifically, we consider the learnability of a class of quantum states introduced by Fefferman and Umans. Under plausible computational complexity assumptions, such states are provably hard to sample from for classical computers, but not for quantum ones. The good level of compression achieved for hard states suggests these methods may be suitable for characterising states of the size expected in first-generation quantum hardware.
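To make the encoding targets concrete, here is a small illustration (not the authors' code) of the probability distributions associated with quantum states: the Born distribution p(z) = |<z|psi>|^2 is computed exactly for a 3-qubit GHZ state and for the uniform superposition, and bit-string samples are drawn from it. A generative model such as a variational autoencoder would be trained on samples of exactly this kind.

```python
import math
import random

def born_distribution(amps):
    # p(z) = |<z|psi>|^2, normalised in case the amplitudes are not.
    norm = sum(abs(a) ** 2 for a in amps)
    return [abs(a) ** 2 / norm for a in amps]

def shannon_entropy(p):
    # Entropy in bits of the measurement-outcome distribution.
    return -sum(q * math.log2(q) for q in p if q > 0)

n = 3
dim = 2 ** n

# GHZ state (|000> + |111>) / sqrt(2): a structured, highly entangled state.
ghz = [0.0] * dim
ghz[0] = ghz[dim - 1] = 1 / math.sqrt(2)
p_ghz = born_distribution(ghz)

# Uniform superposition |+>^n: every outcome equally likely.
plus = [1 / math.sqrt(dim)] * dim
p_plus = born_distribution(plus)

# Training data for a generative model: bit strings sampled from p(z).
samples = random.choices(range(dim), weights=p_ghz, k=1000)
```

The GHZ distribution has 1 bit of entropy concentrated on two outcomes, while the uniform superposition has the maximal 3 bits; compressing such distributions well is precisely what the autoencoder representation is probed on.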

Ciliberto, Carlo, Herbster, Mark, Ialongo, Alessandro Davide, Pontil, Massimiliano, Rocchetto, Andrea, Severini, Simone, Wossnig, Leonard

Recently, increased computational power and data availability, as well as algorithmic advances, have led machine learning techniques to impressive results in regression, classification, data generation and reinforcement learning tasks. Despite these successes, the proximity to the physical limits of chip fabrication, alongside the increasing size of datasets, is motivating a growing number of researchers to explore the possibility of harnessing the power of quantum computation to speed up classical machine learning algorithms. Here we review the literature in quantum machine learning and discuss perspectives for a mixed readership of classical machine learning and quantum computation experts. Particular emphasis will be placed on clarifying the limitations of quantum algorithms, how they compare with their best classical counterparts and why quantum resources are expected to provide advantages for learning problems. Learning in the presence of noise and certain computationally hard problems in machine learning are identified as promising directions for the field. Practical questions, such as how to upload classical data into quantum form, will also be addressed.
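One concrete instance of the data-uploading question is amplitude encoding, a standard textbook approach sketched here for illustration (not taken from the review itself): a classical vector of length 2^n is normalised and stored in the amplitudes of an n-qubit state, so n qubits hold exponentially many real values, at the price of a nontrivial state-preparation step.

```python
import math

def amplitude_encode(x):
    # Normalise a real vector so its entries can serve as amplitudes
    # of the state |x> = sum_z (x_z / ||x||) |z>.
    norm = math.sqrt(sum(v * v for v in x))
    if norm == 0:
        raise ValueError("cannot encode the zero vector")
    return [v / norm for v in x]

x = [3.0, 1.0, 4.0, 1.0]        # length 4 = 2^2, so 2 qubits suffice
amps = amplitude_encode(x)
probs = [a * a for a in amps]   # Born probabilities of each basis state
```

The catch, emphasised in discussions of quantum machine learning speed-ups, is that preparing such a state efficiently for arbitrary data is itself a hard problem: measuring the state only yields samples z with probability proportional to x_z^2, not the vector itself.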

Dendukuri, Aditya, Keeling, Blake, Fereidouni, Arash, Burbridge, Joshua, Luu, Khoa, Churchill, Hugh

This work presents a novel algorithm for defining and training neural networks in quantum information, based on time evolution under a Hamiltonian. Classical artificial neural network (ANN) algorithms are computationally expensive. For example, in image classification, representing an image pixel by pixel using classical information requires an enormous amount of computational memory. Hence, it is important to explore methods of representing images in a different paradigm of information. Quantum Neural Networks (QNNs) have been explored for over 20 years. The current forefront work, based on variational quantum circuits, is specifically defined for the continuous-variable (CV) model of quantum computers. In this work, a model is proposed that is defined at a more fundamental level and hence can be inherited by any variant of quantum computing model. This work also presents a quantum backpropagation algorithm to train the QNN model and validates it on the MNIST dataset in a quantum computer simulation.
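The time-evolution framing can be illustrated with a toy single-qubit "neuron" (a hypothetical sketch, not the paper's model): a feature is encoded as a rotation of |0>, the trainable layer is evolution under the Hamiltonian H = X for a learned time theta, and a finite-difference gradient stands in for the quantum backpropagation step; the two-point dataset is invented for the example.

```python
import math

def forward(x, theta):
    # Encode the feature x as a rotation Ry(x) applied to |0>.
    a0, a1 = math.cos(x / 2), math.sin(x / 2)
    # Time evolution under H = X: U(theta) = cos(theta) I - i sin(theta) X.
    c, s = math.cos(theta), math.sin(theta)
    b0 = complex(c * a0, -s * a1)
    b1 = complex(c * a1, -s * a0)
    # Network output: expectation value of Z in the evolved state.
    return abs(b0) ** 2 - abs(b1) ** 2

# Tiny invented dataset whose labels force the neuron to learn a bit flip.
data = [(0.0, -1.0), (math.pi, 1.0)]

def loss(theta):
    return sum((forward(x, theta) - y) ** 2 for x, y in data) / len(data)

# Gradient descent with a central finite difference standing in for
# the analytic quantum backpropagation rule.
theta, lr, h = 0.3, 0.1, 1e-4
for _ in range(500):
    grad = (loss(theta + h) - loss(theta - h)) / (2 * h)
    theta -= lr * grad
```

For this instance the output is cos(2 theta) cos(x), so training drives theta toward pi/2, where the evolution implements the required flip; the same structure (encode, evolve under a parameterised Hamiltonian, measure, update) scales to the multi-qubit networks the abstract describes.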