Machine learning techniques have so far proved very promising for data analysis in many fields, with numerous potential applications. However, researchers have found that applying these methods to quantum physics problems is far more challenging, owing to the exponential complexity of many-body systems. Quantum many-body systems are microscopic structures made up of many interacting particles. While quantum physics has long studied the collective behavior of such systems, bringing machine learning into these investigations has proven difficult. With this in mind, a team of researchers at Harvard University recently developed a quantum circuit-based algorithm inspired by convolutional neural networks (CNNs), a popular machine learning technique that has achieved remarkable results in a variety of fields.
The resemblance between methods used in quantum many-body physics and in machine learning has drawn considerable attention. In particular, tensor networks (TNs) and deep learning architectures bear striking similarities, to the extent that TNs can themselves be used for machine learning. Previous work applied one-dimensional TNs to image recognition, but showed limited scalability and required a high bond dimension. In this work, we train two-dimensional hierarchical TNs to solve image recognition problems, using a training algorithm derived from the multi-scale entanglement renormalization ansatz (MERA). This approach overcomes the scalability issues and suggests novel mathematical connections among quantum many-body physics, quantum information theory, and machine learning. By keeping the TN unitary during training, TN states can be defined that optimally encode each class of images into a quantum many-body state. We study the quantum features of these TN states, including quantum entanglement and fidelity, and suggest that these quantities could serve as novel properties characterizing the image classes, as well as the machine learning tasks themselves. Our work could be further applied to identifying possible quantum properties of certain artificial intelligence methods.
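To make the idea of a hierarchical TN classifier concrete, here is a minimal toy sketch: pixels are mapped to two-component vectors with a feature map commonly used in TN machine learning, and a tiny binary tree of tensors contracts them into class scores. All sizes, names, and the random (untrained) weights are illustrative assumptions, not the paper's actual architecture or training algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def feature_map(pixels):
    # Map each grayscale pixel x in [0, 1] to the 2-component vector
    # [cos(pi*x/2), sin(pi*x/2)] -- a common choice in TN machine learning.
    return np.stack([np.cos(np.pi * pixels / 2),
                     np.sin(np.pi * pixels / 2)], axis=-1)

# A tiny binary tree tensor network over 4 "pixels" (hypothetical toy sizes):
# two bottom tensors each fuse a pair of pixel vectors into a chi-dim vector,
# and a top tensor maps the two results to num_classes scores.
chi, num_classes = 4, 3
w1 = rng.normal(size=(2, 2, chi))            # bottom-left tensor
w2 = rng.normal(size=(2, 2, chi))            # bottom-right tensor
top = rng.normal(size=(chi, chi, num_classes))

def ttn_forward(pixels):
    v = feature_map(pixels)                  # shape (4, 2)
    a = np.einsum('i,j,ijk->k', v[0], v[1], w1)  # contract first pixel pair
    b = np.einsum('i,j,ijk->k', v[2], v[3], w2)  # contract second pixel pair
    return np.einsum('a,b,abc->c', a, b, top)    # class scores

scores = ttn_forward(np.array([0.1, 0.9, 0.5, 0.3]))
print(scores.shape)  # (3,)
```

In a real setup the tree would cover a full image and the tensors would be trained (e.g. kept isometric, as in MERA-derived schemes); this sketch only shows the contraction pattern.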
Tensor networks: what are they, and why are they so important? "A tensor network is a collection of tensors with indices connected according to a network pattern. It can be used to efficiently represent a many-body wave-function in an otherwise exponentially large Hilbert space." A tensor network can be represented as a graph: each node is a tensor, each edge is an index, and an edge shared by two nodes denotes a contraction over that index.
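The efficiency claim above can be illustrated with a matrix product state (MPS), the simplest tensor network: one small tensor per site replaces a state vector of length 2^n. The sizes below are toy assumptions chosen for illustration, and the tensors are random rather than a physically meaningful state.

```python
import numpy as np

rng = np.random.default_rng(1)

# An MPS for n sites: each tensor has shape (left bond, physical, right bond).
n, d, chi = 8, 2, 3  # sites, physical dimension, bond dimension (toy values)
tensors = [rng.normal(size=(1, d, chi))]
tensors += [rng.normal(size=(chi, d, chi)) for _ in range(n - 2)]
tensors += [rng.normal(size=(chi, d, 1))]

def contract_mps(ts):
    # Contract the chain left to right to recover the full amplitude vector.
    psi = ts[0]                                    # shape (1, d, chi)
    for t in ts[1:]:
        psi = np.tensordot(psi, t, axes=([-1], [0]))
    return psi.reshape(-1)                         # length d**n

psi = contract_mps(tensors)
n_params = sum(t.size for t in tensors)
print(len(psi), n_params)  # 256 amplitudes from 120 parameters
```

The full wave-function has 2^8 = 256 amplitudes, but the network stores only 120 numbers; for fixed bond dimension the parameter count grows linearly in n rather than exponentially, which is exactly the compression the quoted definition refers to.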
The great success of deep learning suggests that the technology contains profound truths: understanding its internal mechanisms not only has important implications for developing the technology and applying it effectively across fields, but also offers meaningful insight into the mechanisms of the human brain. At present, most theoretical research on deep learning is grounded in mathematics. This dissertation proposes that a deep neural network is a physical system, and examines deep learning from three perspectives: the microscopic, the macroscopic, and the physical world view. Using principles of physics, it addresses several theoretical puzzles in deep learning. For example, from the standpoint of quantum mechanics and statistical physics, the dissertation derives calculation methods for convolution, pooling, normalization, and the Restricted Boltzmann Machine, as well as the selection of cost functions; it explains why deep learning must be deep, what characteristics are learned, why Convolutional Neural Networks need not be trained layer by layer, and the limitations of deep learning; and it proposes theoretical directions and foundations for the further development of deep learning. The brilliance of physics shines through deep learning, and we attempt to ground deep learning technology in the scientific theory of physics.
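Of the models named above, the Restricted Boltzmann Machine is the most directly physical: it is an energy-based model with energy E(v, h) = -a·v - b·h - vᵀWh, sampled with block Gibbs updates as in statistical physics. The sketch below shows one such Gibbs sweep; the layer sizes and random weights are illustrative assumptions, not anything from the dissertation.

```python
import numpy as np

rng = np.random.default_rng(42)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# A minimal RBM: binary visible and hidden layers with toy sizes.
n_vis, n_hid = 6, 4
W = rng.normal(scale=0.1, size=(n_vis, n_hid))  # couplings
a = np.zeros(n_vis)                             # visible biases
b = np.zeros(n_hid)                             # hidden biases

def gibbs_step(v):
    # One block-Gibbs sweep: sample hidden units given the visibles,
    # then visibles given the hiddens -- the kernel used in CD-k training.
    p_h = sigmoid(b + v @ W)
    h = (rng.random(n_hid) < p_h).astype(float)
    p_v = sigmoid(a + W @ h)
    v_new = (rng.random(n_vis) < p_v).astype(float)
    return v_new, h

v0 = (rng.random(n_vis) < 0.5).astype(float)
v1, h = gibbs_step(v0)
print(v1.shape, h.shape)  # (6,) (4,)
```

Because the conditional distributions factorize within each layer (the "restricted" bipartite structure), both sampling steps are exact and cheap, which is what makes contrastive-divergence training practical.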
The groundwork for machine learning was laid down in the middle of the last century. When your bank calls to ask about a suspiciously large purchase made on your credit card at a strange time, it's unlikely that a kindly member of staff has personally been combing through your account. Instead, it's more likely that a machine has learned what sorts of behaviour to associate with criminal activity – and that it's spotted something unexpected on your statement. Silently and efficiently, the bank's computer has been using algorithms to watch over your account for signs of theft. Monitoring credit cards in this way is an example of "machine learning" – the process by which a computer system, trained on a given set of examples, develops the ability to perform a task flexibly and autonomously.