Roadmap for 1000 Qubits Fault-tolerant Quantum Computers - Amit Ray

#artificialintelligence

How many qubits are needed to outperform conventional computers, how can a quantum computer be protected from the effects of decoherence, and how can fault-tolerant quantum computers with more than 1000 qubits be designed? These are the three basic questions we deal with in this article. Five key areas of quantum computing are discussed: qubit technologies, qubit quality, qubit count, qubit connectivity, and qubit architectures. Earlier we discussed 7 Core Qubit Technologies for Quantum Computing, 7 Key Requirements for Quantum Computing, Spin-orbit Coupling Qubits for Quantum Computing and AI, Quantum Computing Algorithms for Artificial Intelligence, Quantum Computing and Artificial Intelligence, Quantum Computing with Many World Interpretation Scopes and Challenges, and Quantum Computer with Superconductivity at Room Temperature. Here, we will focus on practical issues related to designing large-scale quantum computers.
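To make the decoherence question concrete: quantum error correction protects information through redundancy. The sketch below (our illustration, not from Ray's article) simulates the classical analogue of the three-qubit repetition code, in which one logical bit is spread over three physical bits and recovered by majority vote, suppressing an independent bit-flip rate p to roughly 3p².

    import random

    # Illustrative sketch (not from the article): the classical analogue
    # of the 3-qubit repetition code, the simplest quantum error-correcting
    # scheme, simulated for independent bit-flip noise only.

    def encode(logical_bit):
        # One logical bit is stored redundantly in three physical bits.
        return [logical_bit] * 3

    def add_noise(bits, p):
        # Each physical bit flips independently with probability p.
        return [b ^ 1 if random.random() < p else b for b in bits]

    def decode(bits):
        # Majority vote recovers the logical bit if at most one flip occurred.
        return 1 if sum(bits) >= 2 else 0

    def logical_error_rate(p, trials=100_000):
        failures = sum(decode(add_noise(encode(0), p)) != 0
                       for _ in range(trials))
        return failures / trials

    # For small p the encoded error rate is ~3p^2, well below the bare rate p.
    for p in (0.01, 0.05, 0.10):
        print(f"physical error rate {p:.2f} -> logical ~{logical_error_rate(p):.4f}")

Real quantum codes must also handle phase errors and cannot clone qubit states, which is why fault-tolerant designs need far more machinery than this toy suggests.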


Space and Time Could Be a Quantum Error-Correcting Code

WIRED

In 1994, a mathematician at AT&T Research named Peter Shor brought instant fame to "quantum computers" when he discovered that these hypothetical devices could quickly factor large numbers -- and thus break much of modern cryptography. But a fundamental problem stood in the way of actually building quantum computers: the innate frailty of their physical components. Original story reprinted with permission from Quanta Magazine, an editorially independent publication of the Simons Foundation whose mission is to enhance public understanding of science by covering research developments and trends in mathematics and the physical and life sciences. Unlike binary bits of information in ordinary computers, "qubits" consist of quantum particles that have some probability of being in each of two states, designated |0⟩ and |1⟩, at the same time. When qubits interact, their possible states become interdependent, each one's chances of |0⟩ and |1⟩ hinging on those of the other.
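In standard notation (a gloss added here, not part of the WIRED excerpt), a lone qubit carries two complex amplitudes rather than a definite bit, and interaction can tie two qubits into a joint state that neither owns separately:

    \[
      |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
      \qquad |\alpha|^2 + |\beta|^2 = 1
    \]
    \[
      |\Phi^{+}\rangle = \frac{1}{\sqrt{2}}\bigl(|00\rangle + |11\rangle\bigr)
    \]

In the Bell state |Φ⁺⟩, measuring either qubit fixes the other's outcome, which is exactly the interdependence the excerpt describes.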


The Era of Quantum Computing Is Here. Outlook: Cloudy

WIRED

After decades of heavy slog with no promise of success, quantum computing is suddenly buzzing with almost feverish excitement and activity. Nearly two years ago, IBM made a quantum computer available to the world: the 5-quantum-bit (qubit) resource they now call (a little awkwardly) the IBM Q experience. That seemed more like a toy for researchers than a way of getting any serious number crunching done. But 70,000 users worldwide have registered for it, and the qubit count in this resource has now quadrupled. In the past few months, IBM and Intel have announced that they have made quantum computers with 50 and 49 qubits, respectively, and Google is thought to have one waiting in the wings. "There is a lot of energy in the community, and the recent progress is immense," said physicist Jens Eisert of the Free University of Berlin.


Constant Overhead Quantum Fault Tolerance with Quantum Expander Codes

Communications of the ACM

The threshold theorem is a seminal result in the field of quantum computing asserting that arbitrarily long quantum computations can be performed on a faulty quantum computer provided that the noise level is below some constant threshold. This remarkable result comes at the price of increasing the number of qubits (quantum bits) by a large factor that scales polylogarithmically with the size of the quantum computation we wish to realize. Minimizing the space overhead for fault-tolerant quantum computation is a pressing challenge that is crucial to benefit from the computational potential of quantum devices.

In this paper, we study the asymptotic scaling of the space overhead needed for fault-tolerant quantum computation. We show that the polylogarithmic factor in the standard threshold theorem is in fact not needed and that there is a fault-tolerant construction that uses a number of qubits that is only a constant factor more than the number of qubits of the ideal computation. This result was conjectured by Gottesman, who suggested replacing the concatenated codes of the standard threshold theorem with quantum error-correcting codes of constant encoding rate. The main challenge was then to find an appropriate family of quantum codes together with an efficient classical decoding algorithm that works even with a noisy syndrome. The efficiency constraint is crucial here: qubits are inherently noisy, and faults keep accumulating during the decoding process. The role of the decoder is therefore to keep the number of errors under control throughout the computation.

On a technical level, our main contribution is the analysis of the SMALL-SET-FLIP decoding algorithm applied to the family of quantum expander codes. We show that it can be parallelized to run in constant time while correcting sufficiently many errors on both the qubits and the syndrome to keep the error under control. These tools can be seen as a quantum generalization of the BIT-FLIP algorithm applied to the (classical) expander codes of Sipser and Spielman. Quantum computers are expected to offer significant, sometimes exponential, speedups compared to classical computers.
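To give a flavor of the classical ancestor of SMALL-SET-FLIP, the sketch below (our own illustration, not the paper's construction) implements a greedy variant of the BIT-FLIP idea: repeatedly flip the bit that participates in the most unsatisfied parity checks. On Sipser-Spielman expander codes this kind of procedure provably corrects a constant fraction of errors; here it is demonstrated on the small [7,4] Hamming code, which corrects any single bit flip.

    import numpy as np

    # Parity-check matrix of the [7,4] Hamming code (a stand-in for a
    # large expander code; the decoding loop is the same in spirit).
    H = np.array([[1, 0, 1, 0, 1, 0, 1],
                  [0, 1, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]], dtype=np.uint8)

    def bit_flip_decode(word, H, max_iters=10):
        """Greedy bit-flip decoding: flip the bit that touches the most
        unsatisfied checks until the syndrome vanishes."""
        word = word.copy()
        for _ in range(max_iters):
            syndrome = H @ word % 2        # 1 marks a violated check
            if not syndrome.any():
                break                      # all parity checks satisfied
            # For each bit, count the unsatisfied checks it appears in.
            unsat_counts = syndrome @ H
            word[np.argmax(unsat_counts)] ^= 1
        return word

    codeword = np.zeros(7, dtype=np.uint8)    # the all-zeros codeword
    received = codeword.copy()
    received[4] ^= 1                          # inject a single bit-flip
    print(bit_flip_decode(received, H))       # -> [0 0 0 0 0 0 0]

The quantum version in the paper must additionally cope with a syndrome that is itself noisy, which is why the constant-time parallel analysis of SMALL-SET-FLIP is the heart of the result.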


Google: We'll build this 'useful' quantum computer by the end of the decade

ZDNet

Google has unveiled its new Quantum AI campus in Santa Barbara, California, where engineers and scientists will be working on its first commercial quantum computer – but that is probably a decade away. The new campus has a focus on both software and hardware. On the hardware front, it includes Google's first quantum data center, quantum hardware research labs, and the company's own quantum processor chip fabrication facilities, says Erik Lucero, lead engineer for Google Quantum AI, in a blog post. Quantum computers offer great promise for cryptography and optimization problems. ZDNet explores what quantum computers will and won't be able to do, and the challenges we still face.