Quantum computers promise to revolutionize computing, leaving classical technology behind on certain complex problems that current machines are ill-equipped to handle. But today's quantum hardware still falls short of the world's most powerful supercomputers. The crossover point, called quantum supremacy, is thought to lie somewhere around 50 qubits -- the quantum analogs of classical computer bits. The most advanced quantum computers built so far sit well below 20 qubits, such as the IBM machine announced in May that runs on 17 qubits. That may be set to change, however: Harvard University's Mikhail Lukin announced at the recently concluded 4th International Conference on Quantum Technologies (ICQT) in Moscow that his team had successfully built and tested a 51-qubit quantum computer.
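The roughly 50-qubit threshold reflects the exponential cost of simulating qubits on classical hardware: an n-qubit state is described by 2^n complex amplitudes, so the memory needed doubles with every added qubit. A minimal sketch of this scaling (the 16-bytes-per-amplitude figure assumes double-precision complex numbers):

```python
def state_vector_bytes(n_qubits: int) -> int:
    """Memory to store a full n-qubit state vector:
    2**n complex amplitudes at 16 bytes each (complex128)."""
    return (2 ** n_qubits) * 16

# 17 qubits (like IBM's machine) fits easily in laptop memory...
print(state_vector_bytes(17) // 2**20, "MiB")   # 2 MiB
# ...but ~50 qubits demands petabytes, beyond any classical computer.
print(state_vector_bytes(50) // 2**50, "PiB")   # 16 PiB
```

This back-of-the-envelope count is why brute-force simulation breaks down near 50 qubits, even before accounting for the time needed to apply gates to such a vector.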
Intel Labs unveiled a first-of-its-kind cryogenic control chip -- code-named "Horse Ridge" -- that will speed up development of full-stack quantum computing systems. Horse Ridge will enable control of multiple quantum bits (qubits) and set a clear path toward scaling larger systems -- a major milestone on the path to quantum practicality. Developed together with Intel's research collaborators at QuTech, a partnership between TU Delft and TNO (Netherlands Organization for Applied Scientific Research), Horse Ridge is fabricated using Intel's 22nm FinFET Low Power (22FFL) technology. In-house fabrication of these control chips at Intel will dramatically accelerate the company's ability to design, test and optimize a commercially viable quantum computer. Jim Clarke, Intel's director of quantum hardware, says this integration is possible because of the kind of qubits the company uses.
Here we discuss the advantages and limitations of seven key qubit technologies for designing efficient quantum computing systems: superconducting qubits, quantum dot qubits, trapped-ion qubits, photonic qubits, defect-based qubits, topological qubits, and nuclear magnetic resonance (NMR) qubits. These are seven pathways toward effective quantum computing systems, each with its own advantages and limitations. We also discuss the hierarchies of qubit types.
How many qubits are needed to outperform conventional computers? How can a quantum computer be protected from the effects of decoherence? And how can fault-tolerant, large-scale quantum computers with more than 1,000 qubits be designed? These are the three basic questions we address in this article. We discuss five key areas of quantum computing: qubit technologies, qubit quality, qubit count, qubit connectivity, and qubit architectures. Earlier we discussed 7 Core Qubit Technologies for Quantum Computing; 7 Key Requirements for Quantum Computing; Spin-orbit Coupling Qubits for Quantum Computing and AI; Quantum Computing Algorithms for Artificial Intelligence; Quantum Computing and Artificial Intelligence; Quantum Computing with Many World Interpretation Scopes and Challenges; and Quantum Computer with Superconductivity at Room Temperature. Here, we focus on practical issues in designing large-scale quantum computers.
The race to create superfast computers is accelerating. A rethink of one of the most fundamental parts of a quantum computer could pave the way for ultra-powerful devices. Andrea Morello at the University of New South Wales in Australia and his colleagues have a design for a qubit – the smallest unit of quantum information – that could help get around some of the difficulties of manufacturing quantum computers at an atomic scale. At the moment, making quantum systems using silicon is difficult because the qubits have to be very close to each other, about 10 to 20 nanometres apart, in order to communicate. This leaves little room for the electronics needed to make a quantum computer work.