Machine Learning for Quantum Design

#artificialintelligence

In this talk I will discuss some of the long-term challenges that emerge in the effort to make deep learning a relevant tool for controlled scientific discovery in many-body quantum physics. The current state of the art in deep neural quantum states and learning tools will be discussed in connection with challenging open problems in condensed matter physics, including frustrated magnetism and quantum dynamics.

Variational algorithms for a gate-based quantum computer, like the QAOA, prescribe a fixed circuit ansatz, up to a set of continuous parameters, that is designed to find a low-energy state of a given target Hamiltonian. After reviewing the relevant aspects of the QAOA, I will describe attempts to make the algorithm more efficient. The strategies I will explore are 1) tuning the variational objective function away from the energy expectation value, 2) analytical estimates that allow the elimination of some gates from the QAOA circuit, and 3) using machine learning methods to search the design space of nearby circuits for improvements to the original ansatz.
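
To make the QAOA structure concrete, here is a minimal, self-contained sketch of a plain QAOA energy evaluation for MaxCut, simulated with a dense state vector in NumPy. The 4-cycle graph, the p = 1 depth, and the grid search over (gamma, beta) are illustrative assumptions, not the speaker's actual setup or any of the three refinements listed above.

```python
import numpy as np

# Minimal QAOA-for-MaxCut sketch, simulated with a dense state vector.
# The 4-cycle graph, p = 1 depth, and the grid search over (gamma, beta)
# are illustrative assumptions, not choices taken from the talk.

edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
n, dim = 4, 2 ** 4

# Diagonal of the MaxCut cost: C(z) = sum over edges of (1 - z_i * z_j) / 2,
# with z_q = +1/-1 read off each computational-basis bitstring.
z = np.array([[1 - 2 * ((k >> q) & 1) for q in range(n)] for k in range(dim)])
cost = sum((1 - z[:, i] * z[:, j]) / 2 for i, j in edges)

def apply_mixer(state, beta):
    """Apply exp(-i * beta * X) to every qubit."""
    c, s = np.cos(beta), -1j * np.sin(beta)
    for q in range(n):
        psi = state.reshape(2 ** (n - q - 1), 2, 2 ** q)
        state = (c * psi + s * psi[:, ::-1, :]).reshape(dim)
    return state

def qaoa_expected_cut(gammas, betas):
    """Expected cut value <C> after len(gammas) QAOA layers."""
    state = np.full(dim, 1 / np.sqrt(dim), dtype=complex)   # uniform |+...+>
    for g, b in zip(gammas, betas):
        state = np.exp(-1j * g * cost) * state               # cost layer
        state = apply_mixer(state, b)                        # mixer layer
    return float(np.real(np.vdot(state, cost * state)))

# Crude p = 1 grid search; the talk's refinements would replace this step.
grid = np.linspace(0, np.pi, 25)
best = max(((g, b, qaoa_expected_cut([g], [b])) for g in grid for b in grid),
           key=lambda t: t[2])
print("best (gamma, beta, <C>): %.3f %.3f %.3f" % best)
```

The variational outer loop is deliberately the dumbest possible optimizer; strategies 1) through 3) above each target a different piece of this loop (the objective, the circuit, and the ansatz search, respectively).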


Improving quantum computation with classical machine learning

#artificialintelligence

Quantum computers aren't constrained to two states; they encode data as quantum bits, or qubits, which can exist in superposition. Qubits are realized by particles such as photons or electrons, together with the control devices that work with them to act as computer memory and a processor. Qubits can interact with anything nearby that carries energy close to their own, for example photons, phonons, or quantum defects, which can change the state of the qubits themselves. Manipulating and reading out qubits is performed through classical controls: analog signals in the form of electromagnetic fields coupled to a physical substrate in which the qubit is embedded, e.g., superconducting circuits. Imperfections in these control electronics, interference from external sources of radiation, and fluctuations in digital-to-analog converters introduce even more stochastic errors that degrade the performance of quantum circuits.
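
As a toy illustration of the environmental disturbance this paragraph describes, the sketch below prepares a qubit in an equal superposition and applies a random phase kick, an assumed stand-in for stray photons, phonons, or substrate defects; it is plain NumPy and implies no hardware or vendor API.

```python
import numpy as np

# Toy model of environmental dephasing: a qubit in the |+> superposition
# picks up a random Z-phase (an assumed stand-in for stray photons, phonons,
# or substrate defects). Pure NumPy; no hardware or vendor API is implied.

rng = np.random.default_rng(0)

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
plus = (ket0 + ket1) / np.sqrt(2)            # equal superposition of 0 and 1

def phase_kick(state, sigma):
    """Apply a random Z-phase with standard deviation sigma."""
    phi = rng.normal(0.0, sigma)
    return np.array([state[0], np.exp(1j * phi) * state[1]])

# Probability of still finding the qubit in |+> after one noisy kick,
# averaged over many shots: drifts from 1.0 toward the 0.5 of a fair coin.
for sigma in (0.0, 0.3, 1.0):
    p = np.mean([abs(np.vdot(plus, phase_kick(plus, sigma))) ** 2
                 for _ in range(2000)])
    print(f"sigma = {sigma:.1f}   P(+) ~ {p:.3f}")
```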


Improving Quantum Computation with Classical Machine Learning

#artificialintelligence

One of the primary challenges for the realization of near-term quantum computers has to do with their most basic constituent: the qubit. Qubits can interact with anything in close proximity that carries energy close to their own, such as stray photons (i.e., unwanted electromagnetic fields), phonons (mechanical oscillations of the quantum device), or quantum defects (irregularities in the substrate of the chip formed during manufacturing), any of which can unpredictably change the state of the qubits themselves. Further complicating matters, there are numerous challenges posed by the tools used to control qubits. Manipulating and reading out qubits is performed via classical controls: analog signals in the form of electromagnetic fields coupled to a physical substrate in which the qubit is embedded, e.g., superconducting circuits. Imperfections in these control electronics (giving rise to white noise), interference from external sources of radiation, and fluctuations in digital-to-analog converters introduce even more stochastic errors that degrade the performance of quantum circuits.
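
The stochastic control errors mentioned in the last sentence can be pictured with a short simulation. The sketch below is an assumption, not the article's method: it adds Gaussian jitter to the rotation angle of repeated X pi-pulses and watches the return fidelity decay as the gate count grows.

```python
import numpy as np

# Sketch of stochastic control error: Gaussian jitter on the rotation angle
# of repeated X pi-pulses, an assumed stand-in for noisy control electronics
# and DAC fluctuations. Ideally an even number of pi-pulses is the identity
# (up to global phase), so any fidelity loss below is due to the jitter.

rng = np.random.default_rng(1)

def rx(theta):
    """Single-qubit rotation about X by angle theta."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -1j * s], [-1j * s, c]])

def noisy_sequence(n_gates, jitter):
    """Run n_gates pi-pulses whose angles carry Gaussian control jitter."""
    state = np.array([1, 0], dtype=complex)          # start in |0>
    for _ in range(n_gates):
        state = rx(np.pi + rng.normal(0.0, jitter)) @ state
    return state

ket0 = np.array([1, 0], dtype=complex)
for n_gates in (2, 20, 200):
    fid = np.mean([abs(np.vdot(ket0, noisy_sequence(n_gates, 0.05))) ** 2
                   for _ in range(500)])
    print(f"{n_gates:4d} gates   return fidelity ~ {fid:.4f}")
```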


QC Ware Races Ahead With Breakthrough in Quantum Machine Learning Algorithms

#artificialintelligence

QC Ware, the leader in enterprise software and services for quantum computing, today announced a significant breakthrough in quantum machine learning (QML) that increases QML accuracy and shortens the industry timeline for practical QML applications on near-term quantum computers. QC Ware's algorithms researchers have discovered how classical data can be loaded onto quantum hardware efficiently and how distance estimation can be performed on quantum hardware. These new capabilities, enabled by Data Loaders, are now available in the latest release of QC Ware's Forge cloud services platform, an integrated environment to build, edit, and implement quantum algorithms on quantum hardware and simulators. "QC Ware estimates that with Forge Data Loaders, the industry's 10-to-15-year timeline for practical applications of QML will be reduced significantly," said Yianni Gamvros, Head of Product and Business Development at QC Ware. "What our algorithms team has achieved for the quantum computing industry is equivalent to a quantum hardware manufacturer introducing a chip that is 10 to 100 times faster than their previous offering. This exciting development will require business analysts to update their quad charts and innovation scouts to adjust their technology timelines."
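
The announcement gives no implementation detail, so the following is only a generic sketch of the two ingredients it names: amplitude-encoding a classical vector as a normalized quantum state, and estimating a distance from a measured state overlap, as a swap test would supply. Every function name here is hypothetical, and none of this is the Forge API.

```python
import numpy as np

# Generic sketch only: the press release names "data loading" and "distance
# estimation" but gives no construction, so this shows the textbook version:
# amplitude-encode a classical vector as a normalized state, then recover
# the squared overlap from sampled swap-test statistics. All names here are
# hypothetical; this is NOT the QC Ware Forge API.

rng = np.random.default_rng(2)

def amplitude_encode(x):
    """Normalize a classical vector so it is a valid amplitude vector."""
    x = np.asarray(x, dtype=float)
    return x / np.linalg.norm(x)

def sampled_overlap_sq(psi, phi, shots=4096):
    """Estimate |<psi|phi>|^2 the way a swap test would: its ancilla reads
    0 with probability (1 + |<psi|phi>|^2) / 2, so sample and invert."""
    p0 = (1 + abs(np.vdot(psi, phi)) ** 2) / 2
    zeros = rng.binomial(shots, p0)
    return 2 * zeros / shots - 1

psi = amplitude_encode([1.0, 2.0, 3.0, 4.0])
phi = amplitude_encode([2.0, 1.0, 0.0, 1.0])

# For real, non-negative data, <psi|phi> >= 0, so the Euclidean distance of
# the normalized vectors follows: ||psi - phi||^2 = 2 - 2 * sqrt(overlap_sq).
est = sampled_overlap_sq(psi, phi)
print("estimated |<psi|phi>|^2:", round(est, 3))
print("exact     |<psi|phi>|^2:", round(abs(np.vdot(psi, phi)) ** 2, 3))
print("distance^2 from estimate:", round(2 - 2 * np.sqrt(max(est, 0.0)), 3))
```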