Goto

Collaborating Authors

Google Accelerates Quantum Computation with Classical Machine Learning

#artificialintelligence

Tech giant Google's recent claim of quantum supremacy created a buzz in the computer science community and got global mainstream media talking about quantum computing breakthroughs. Yesterday Google fed the public's growing interest in the topic with a blog post introducing a study on improving quantum computation using classical machine learning. The qubit is the most basic constituent of quantum computing, and it also poses one of the most significant challenges for the realization of near-term quantum computers: several of its characteristics make it difficult to control. Google AI explains that issues such as imperfections in the control electronics can "impact the fidelity of the computation and thus limit the applications of near-term quantum devices."
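To see why small control imperfections matter, here is a minimal numpy sketch (an illustrative toy of my own, not code from Google's study or blog post) in which a 1% systematic over-rotation on each gate compounds with circuit depth and steadily erodes state fidelity.

```python
# Illustrative sketch only: a small, systematic over-rotation on every gate
# (e.g. from miscalibrated control electronics) compounds with circuit depth.
import numpy as np

def rx(theta):
    """Single-qubit rotation about the X axis."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -1j * s], [-1j * s, c]])

ideal_angle = np.pi / 2   # intended pi/2 rotation per gate
error = 0.01              # 1% over-rotation from imperfect control electronics

for depth in (1, 10, 50, 100):
    ideal = np.array([1.0 + 0j, 0.0])
    noisy = np.array([1.0 + 0j, 0.0])
    for _ in range(depth):
        ideal = rx(ideal_angle) @ ideal
        noisy = rx(ideal_angle * (1 + error)) @ noisy
    fidelity = abs(np.vdot(ideal, noisy)) ** 2
    print(f"depth {depth:3d}: state fidelity {fidelity:.4f}")
```

In this toy circuit the fidelity falls to roughly 0.5 by depth 100, which is the kind of degradation the post refers to.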


Improving quantum computation with classical machine learning

#artificialintelligence

Quantum computers aren't constrained to two states; they encode data as quantum bits, or qubits, which can exist in superposition. Qubits are realized by particles such as photons or electrons, together with their respective control devices, all working in concert to act as computer memory and a processor. Qubits can interact with anything nearby that carries energy close to their own, for example photons, phonons, or quantum defects, which can change the state of the qubits themselves. Qubits are manipulated and controlled through classical means: analog signals, such as electromagnetic fields, coupled to the physical substrate in which the qubit is embedded, e.g., superconducting circuits. Defects in these control electronics, interference from external sources of radiation, and variances in digital-to-analog converters introduce even more stochastic errors that degrade the performance of quantum circuits.
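The following short numpy sketch (a toy model of my own, not drawn from the article) illustrates the environmental coupling described above: a qubit prepared in an equal superposition picks up a random phase on each run, and averaging the resulting density matrix over many runs washes out the off-diagonal coherence.

```python
# Illustrative dephasing toy: random phase kicks from the environment
# destroy the coherence of a superposition state when averaged over runs.
import numpy as np

rng = np.random.default_rng(0)
plus = np.array([1.0, 1.0]) / np.sqrt(2)    # equal superposition of |0> and |1>

for sigma in (0.0, 0.3, 1.0, 3.0):          # strength of random phase kicks (radians)
    rho_avg = np.zeros((2, 2), dtype=complex)
    for _ in range(5000):
        phi = rng.normal(0.0, sigma)        # phase picked up from the environment
        state = np.array([plus[0], plus[1] * np.exp(1j * phi)])
        rho_avg += np.outer(state, state.conj())
    rho_avg /= 5000
    print(f"phase noise sigma={sigma:3.1f}: coherence |rho_01| = {abs(rho_avg[0, 1]):.3f}")
```

As the phase noise grows, the off-diagonal element of the averaged density matrix shrinks toward zero, i.e., the qubit behaves less and less like a coherent superposition.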


Learning in Quantum Control: High-Dimensional Global Optimization for Noisy Quantum Dynamics

arXiv.org Machine Learning

Quantum control is valuable for various quantum technologies such as high-fidelity gates for universal quantum computing, adaptive quantum-enhanced metrology, and ultra-cold atom manipulation. Although supervised machine learning and reinforcement learning are widely used for optimizing control parameters in classical systems, quantum control for parameter optimization is mainly pursued via gradient-based greedy algorithms. Although the quantum fitness landscape is often compatible with greedy algorithms, they sometimes yield poor results, especially for large-dimensional quantum systems. We employ differential evolution algorithms to circumvent the stagnation problem of non-convex optimization. We improve quantum control fidelity for noisy systems by averaging over the objective function. To reduce computational cost, we introduce heuristics for early termination of runs and for adaptive selection of search subspaces. Our implementation is massively parallel and vectorized to reduce run time even further. We demonstrate our methods with two examples, namely quantum phase estimation and quantum gate design, for which we achieve superior fidelity and scalability compared to greedy algorithms.
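As a rough sketch of this kind of approach (a simplified toy of mine using SciPy's stock differential evolution, not the authors' massively parallel implementation), the snippet below searches for three pulse angles that realize a Hadamard-like gate, scoring each candidate by its gate fidelity averaged over random amplitude noise.

```python
# Toy sketch: differential evolution over control angles for a Rz-Rx-Rz pulse
# sequence, with the objective averaged over stochastic amplitude errors.
import numpy as np
from scipy.optimize import differential_evolution

def rx(t):
    c, s = np.cos(t / 2), np.sin(t / 2)
    return np.array([[c, -1j * s], [-1j * s, c]])

def rz(t):
    return np.array([[np.exp(-1j * t / 2), 0], [0, np.exp(1j * t / 2)]])

target = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
rng = np.random.default_rng(1)

def avg_infidelity(angles, noise=0.02, shots=20):
    """Gate infidelity averaged over random amplitude errors on each pulse."""
    total = 0.0
    for _ in range(shots):
        a1, a2, a3 = angles * (1 + rng.normal(0.0, noise, size=3))
        gate = rz(a3) @ rx(a2) @ rz(a1)
        # phase-invariant two-level gate fidelity |Tr(U_target^dagger U)|^2 / 4
        total += abs(np.trace(target.conj().T @ gate)) ** 2 / 4
    return 1.0 - total / shots

bounds = [(-np.pi, np.pi)] * 3
result = differential_evolution(avg_infidelity, bounds, seed=1, tol=1e-6)
print("best angles:", result.x)
print("noise-averaged infidelity:", result.fun)
```

Averaging over several noise draws inside the objective, as the abstract describes, trades extra function evaluations for an optimum that stays robust when the controls are perturbed.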



Why Google's new quantum computer could launch an artificial intelligence arms race

AITopics Original Links

Ever since the 1980s, researchers have been working on the development of a quantum computer that would be exponentially more powerful than any of the digital computers that exist today. And now Google, in collaboration with NASA and the Universities Space Research Association (USRA), says it has a quantum computer -- the D-Wave 2X -- that actually works. Google claims the D-Wave 2X is 100 million times faster than any of today's machines. As a result, this quantum computer could theoretically complete, within seconds, a calculation that might take a digital computer 10,000 years. That's particularly important, given the difficult tasks that today's computers are called upon to complete and the staggering amount of data they are asked to process.