Intel offers AI breakthrough in quantum computing

ZDNet

We don't know why deep neural networks achieve such success on so many tasks; the discipline has a paucity of theory to explain its empirical results. As Facebook's Yann LeCun has said, deep learning is like the steam engine, which preceded the underlying theory of thermodynamics by many years. But some deep thinkers have been plugging away at the matter of theory for several years now. On Wednesday, one such group presented a proof of deep learning's superior ability to simulate the computations involved in quantum computing. According to these thinkers, the redundancy of information that arises in two of the most successful neural network types, convolutional neural networks (CNNs) and recurrent neural networks (RNNs), makes all the difference.


IBM, Intel Papers Report AI Breakthroughs for Quantum Science

#artificialintelligence

Fascinatingly, two announcements today show how AI (machine and deep learning) can influence quantum computing in quite different ways. The twin announcements closely track prestigious publications. The MIT, Oxford, and IBM-led paper, Supervised learning with quantum-enhanced feature spaces, was published in Nature today. The Intel-led paper, Quantum Entanglement in Deep Learning Architectures, was published in APS Physical Review Letters last month. Intel made its announcement in conjunction with Intel Mobileye co-founder/CEO Amnon Shashua's keynote today at the National Academy of Sciences' 'Science of Deep Learning' conference.


Machine Learning by Two-Dimensional Hierarchical Tensor Networks: A Quantum Information Theoretic Perspective on Deep Architectures

arXiv.org Machine Learning

The resemblance between the methods used in studying quantum many-body physics and those used in machine learning has drawn considerable attention. In particular, tensor networks (TNs) and deep learning architectures bear striking similarities, to the extent that TNs can be used for machine learning. Previous results used one-dimensional TNs for image recognition, showing limited scalability and a high bond dimension. In this work, we train two-dimensional hierarchical TNs to solve image recognition problems, using a training algorithm derived from the multipartite entanglement renormalization ansatz (MERA). This approach overcomes scalability issues and implies novel mathematical connections among quantum many-body physics, quantum information theory, and machine learning. By keeping the TN unitary during training, we can define TN states that optimally encode each class of images as a quantum many-body state. We study the quantum features of the TN states, including quantum entanglement and fidelity. We suggest these quantities could be novel properties that characterize the image classes, as well as the machine learning tasks. Our work could be further applied to identifying possible quantum properties of certain artificial intelligence methods.
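To make the idea of a hierarchical TN classifier concrete, here is a minimal toy sketch (not the paper's implementation, and far simpler than a trained MERA-derived network): a two-layer tree tensor network that contracts four pixel features up to class scores. All tensor shapes, the feature map, and the random initialization are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def feature_map(pixel):
    # Map a pixel value in [0, 1] to a 2-dim local feature vector,
    # a common choice in tensor-network machine learning.
    return np.array([np.cos(np.pi * pixel / 2), np.sin(np.pi * pixel / 2)])

D = 4            # bond dimension (illustrative)
n_classes = 10   # e.g. the ten MNIST digit classes

# Layer 1: two isometry-shaped tensors, each merging two 2-dim
# features into one D-dim bond (untrained random values here).
iso_left = rng.normal(size=(2, 2, D))
iso_right = rng.normal(size=(2, 2, D))
# Top tensor: merges the two D-dim bonds into class scores.
top = rng.normal(size=(D, D, n_classes))

def classify(pixels):
    # Contract the tree bottom-up: features -> branch bonds -> scores.
    v = [feature_map(p) for p in pixels]
    b_left = np.einsum('i,j,ijk->k', v[0], v[1], iso_left)
    b_right = np.einsum('i,j,ijk->k', v[2], v[3], iso_right)
    return np.einsum('a,b,abc->c', b_left, b_right, top)

scores = classify([0.1, 0.9, 0.4, 0.6])
print(scores.shape)  # one score per class
```

Training would optimize the isometries under a unitarity constraint, layer by layer; this sketch only shows the contraction pattern that such a hierarchical network evaluates.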


Information Perspective to Probabilistic Modeling: Boltzmann Machines versus Born Machines

arXiv.org Machine Learning

We compare and contrast the statistical physics and quantum physics inspired approaches for unsupervised generative modeling of classical data. The two approaches represent probabilities of observed data using energy-based models and quantum states, respectively. Classical and quantum information patterns of the target datasets therefore provide principled guidelines for structural design and learning in these two approaches. Taking the restricted Boltzmann machine (RBM) as an example, we analyze the information-theoretic bounds of the two approaches. We verify our reasoning by comparing the performance of RBMs of various architectures on the standard MNIST dataset.
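The core contrast in the abstract can be sketched in a few lines. In this illustrative toy (not the paper's models), an energy-based model assigns p(v) proportional to exp(-E(v)), while a Born machine assigns p(v) proportional to |psi(v)|^2 for a wavefunction amplitude psi; the toy energy function and the random real amplitudes are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(1)
n_visible = 3
# All 2^n binary configurations of the visible units.
states = np.array([[int(b) for b in f"{i:03b}"] for i in range(2 ** n_visible)])

# Energy-based model (RBM-like, with a toy linear energy E(v) = -w.v):
w = rng.normal(size=n_visible)
energies = -states @ w
p_energy = np.exp(-energies)
p_energy /= p_energy.sum()          # Boltzmann distribution over states

# Born machine: real amplitudes psi(v); probabilities are squared amplitudes.
psi = rng.normal(size=2 ** n_visible)
p_born = psi ** 2
p_born /= p_born.sum()              # Born-rule distribution over states

# Both define valid distributions over the same 2^n configuration space;
# the difference lies in what the parameters represent (energies vs amplitudes).
print(p_energy.sum(), p_born.sum())
```

The paper's point is that this representational difference changes which correlation patterns (classical vs quantum) each model family captures efficiently; the sketch only shows the two normalization rules side by side.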