
Quantum Hype and Quantum Skepticism

Communications of the ACM

The first third of the 20th century saw the collapse of many absolutes. Albert Einstein's 1905 special relativity theory eliminated the notion of absolute time, while Kurt Gödel's 1931 incompleteness theorem questioned the notion of absolute mathematical truth. Most profoundly, however, quantum mechanics cast doubt on the notion of absolute objective reality. Is Schrödinger's cat dead or alive? Nearly 100 years after quantum mechanics was introduced, scientists are still not in full agreement on what it means.


Google Accelerates Quantum Computation with Classical Machine Learning

#artificialintelligence

Tech giant Google's recent claim of quantum supremacy created a buzz in the computer science community and got global mainstream media talking about quantum computing breakthroughs. Yesterday Google fed the public's growing interest in the topic with a blog post introducing a study on improving quantum computation using classical machine learning. The qubit is the most basic constituent of quantum computing, and it also poses one of the most significant challenges for the realization of near-term quantum computers. Several physical characteristics of qubits make them challenging to control. Google AI explains that issues such as imperfections in the control electronics can "impact the fidelity of the computation and thus limit the applications of near-term quantum devices."
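The fidelity point is easy to illustrate. Below is a minimal NumPy sketch, not Google's method: the 5% over-rotation epsilon is a hypothetical stand-in for control-electronics imperfections, showing how a small systematic pulse error degrades the average fidelity of a single-qubit X gate.

```python
import numpy as np

# Pauli-X matrix; rx(theta) rotates a qubit about the x-axis by theta.
X = np.array([[0, 1], [1, 0]], dtype=complex)

def rx(theta):
    return np.cos(theta / 2) * np.eye(2) - 1j * np.sin(theta / 2) * X

# An ideal X gate is Rx(pi) (up to a global phase). A miscalibrated
# control pulse over-rotates by a small systematic fraction epsilon.
epsilon = 0.05  # hypothetical 5% over-rotation from imperfect electronics
ideal = rx(np.pi)
noisy = rx(np.pi * (1 + epsilon))

# Average gate fidelity between two single-qubit unitaries (d = 2):
# F = (|Tr(U_ideal^dagger U_noisy)|^2 + d) / (d^2 + d)
d = 2
overlap = np.trace(ideal.conj().T @ noisy)
fidelity = (abs(overlap) ** 2 + d) / (d ** 2 + d)
print(f"average X-gate fidelity with {epsilon:.0%} over-rotation: {fidelity:.5f}")
```

Even a fraction of a percent of infidelity per gate compounds over the many gates a useful circuit requires, which is why automated calibration of control parameters is an attractive target for machine learning.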


Technical Perspective: Deciphering Errors to Reduce the Cost of Quantum Computation

Communications of the ACM

Quantum computers may one day upend cryptography, help design new materials and drugs, and accelerate many other computational tasks. A quantum computer's memory is a quantum system, capable of being in a superposition of many different bit strings at once. It can take advantage of quantum interference to run uniquely quantum algorithms that solve some (but not all) computational problems much faster than a classical computer. Experimental efforts to build a quantum computer have taken enormous strides forward in the last decade, leading to today's devices with over 50 quantum bits ("qubits"). Governments and large technology companies such as Google, IBM, and Microsoft, as well as a slew of start-ups, have begun pouring money into the field, hoping to be the first with a useful quantum computer.
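The superposition-and-interference idea is concrete enough to show in a few lines. Here is a minimal state-vector sketch in NumPy, not tied to any particular hardware: one Hadamard gate puts a qubit into an equal superposition of 0 and 1, and a second Hadamard makes the two computational paths interfere so the qubit returns to 0 with certainty.

```python
import numpy as np

# A qubit state is a 2-vector of complex amplitudes; |0> = (1, 0).
ket0 = np.array([1, 0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate

# One Hadamard: equal superposition of |0> and |1>.
superposed = H @ ket0
print("amplitudes after H:", superposed)                      # [0.707, 0.707]
print("measurement probabilities:", np.abs(superposed) ** 2)  # [0.5, 0.5]

# A second Hadamard makes the two paths interfere: the |1> contributions
# cancel (destructive interference) while the |0> contributions add
# (constructive interference), so the qubit returns to |0> with certainty.
interfered = H @ superposed
print("after H then H:", np.round(interfered.real, 10))       # [1, 0]
```

Quantum algorithms exploit exactly this effect at scale, arranging interference so that amplitudes for wrong answers cancel while amplitudes for right answers reinforce.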


The Era of Quantum Computing Is Here. Outlook: Cloudy

WIRED

After decades of heavy slog with no promise of success, quantum computing is suddenly buzzing with almost feverish excitement and activity. Nearly two years ago, IBM made a quantum computer available to the world: the 5-quantum-bit (qubit) resource they now call (a little awkwardly) the IBM Q experience. That seemed more like a toy for researchers than a way of getting any serious number crunching done. But 70,000 users worldwide have registered for it, and the qubit count in this resource has now quadrupled. In the past few months, IBM and Intel have announced that they have made quantum computers with 50 and 49 qubits, respectively, and Google is thought to have one waiting in the wings. "There is a lot of energy in the community, and the recent progress is immense," said physicist Jens Eisert of the Free University of Berlin.


Fault-tolerant detection of a quantum error

Science

A critical component of any quantum error–correcting scheme is detection of errors by using an ancilla system. We demonstrate a fault-tolerant error-detection scheme that suppresses spreading of ancilla errors by a factor of 5, while maintaining the assignment fidelity. The same method is used to prevent propagation of ancilla excitations, increasing the logical qubit dephasing time by an order of magnitude. Our approach is hardware-efficient, as it uses a single multilevel transmon ancilla and a cavity-encoded logical qubit, whose interaction is engineered in situ by using an off-resonant sideband drive. The results demonstrate that hardware-efficient approaches that exploit system-specific error models can yield advances toward fault-tolerant quantum computation.
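The paper's hardware (a multilevel transmon ancilla coupled to a cavity-encoded qubit via a sideband drive) is beyond a short sketch, but the underlying idea of ancilla-based error detection can be illustrated with a toy state-vector simulation: two data qubits are CNOT-ed into an ancilla, so measuring the ancilla reveals their joint parity, and a single bit-flip error shows up as an ancilla reading of 1. This is a generic syndrome-measurement sketch, not the scheme demonstrated in the paper.

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)

def kron(*ops):
    """Tensor product of single-qubit operators (qubit 0 leftmost)."""
    out = np.array([[1]], dtype=complex)
    for op in ops:
        out = np.kron(out, op)
    return out

def cnot(control, target, n):
    """CNOT on an n-qubit register, built from projectors on the control."""
    P0 = np.array([[1, 0], [0, 0]], dtype=complex)  # |0><0|
    P1 = np.array([[0, 0], [0, 1]], dtype=complex)  # |1><1|
    ops0, ops1 = [I2] * n, [I2] * n
    ops0[control] = P0
    ops1[control] = P1
    ops1[target] = X
    return kron(*ops0) + kron(*ops1)

n = 3  # qubits 0 and 1 are data, qubit 2 is the ancilla
ket = np.zeros(2 ** n, dtype=complex)
ket[0] = 1.0  # start in |000>

# Inject a bit-flip error on data qubit 1 (set to False for a clean run).
inject_error = True
if inject_error:
    ket = kron(I2, X, I2) @ ket

# Syndrome extraction: CNOT each data qubit into the ancilla, so the
# ancilla accumulates the joint parity of the two data qubits.
ket = cnot(0, 2, n) @ ket
ket = cnot(1, 2, n) @ ket

# Measuring only the ancilla reveals whether an odd number of bit flips
# occurred, without measuring the data qubits directly.
prob_ancilla_1 = sum(abs(a) ** 2 for i, a in enumerate(ket) if i & 1)
print("P(ancilla reads 1):", prob_ancilla_1)  # 1.0 with the injected error
```

In this toy version, an error on the ancilla itself could propagate back into the data; the paper's contribution is making the check fault-tolerant, so that ancilla errors are prevented from spreading into the logical qubit.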