Learning the noise fingerprint of quantum devices (Artificial Intelligence)

In the context of quantum technologies, no quantum device can be considered an isolated (ideal) quantum system. For this reason, the acronym Noisy Intermediate-Scale Quantum (NISQ) technology was recently introduced [1] to identify the class of early devices in which noise in quantum gates dramatically limits the size of the circuits and algorithms that can be reliably performed [2, 3]. As early quantum devices become more widespread, a natural question is whether, at the experimental level, the signature left by the internal noise processes of a generic quantum device exhibits universal features or is instead characteristic of the specific quantum platform. Moreover, one may ask whether such a noise signature has a time-dependent profile or can be effectively considered stable, in the sense of constant over time, while the device is operating. The answers to these questions are expected to be crucial in defining a proper strategy to mitigate the influence of noise and systematic errors [4-8], possibly going beyond standard quantum sensing techniques [9-14] and overcoming current limitations on probe dimension and resolution [9, 10, 15-18].

Representation of binary classification trees with binary features by quantum circuits (Machine Learning)

We propose a quantum representation of binary classification trees with binary features based on a probabilistic approach. By using the quantum computer as a processor for probability distributions, a probabilistic traversal of the decision tree can be realized via measurements of a quantum circuit. We describe how tree inductions and the prediction of class labels of query data can be integrated into this framework. An on-demand sampling method enables predictions with a constant number of classical memory slots, independent of the tree depth. We experimentally study our approach using both a quantum computing simulator and actual IBM quantum hardware. To our knowledge, this is the first realization of a decision tree classifier on a quantum device.
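The core idea, a probabilistic traversal realized by measuring a circuit, can be illustrated classically. The sketch below is not the paper's implementation; it is a minimal stand-in in which each internal node of a hypothetical toy tree carries a branching probability (playing the role of a qubit rotation angle), one "measurement" samples a root-to-leaf path, and repeated shots yield a majority-vote prediction with constant classical memory.

```python
import random

# Hypothetical toy tree: each internal node stores the probability of
# branching right; leaves store a class label.  One measurement of the
# corresponding quantum circuit samples a single root-to-leaf path.
tree = {
    "p_right": 0.3,
    "left":  {"p_right": 0.8, "left": {"label": 0}, "right": {"label": 1}},
    "right": {"label": 1},
}

def sample_path(node, rng=random):
    """Emulate one 'measurement': walk the tree, branching right with
    the node's probability, and return the leaf's class label."""
    while "label" not in node:
        node = node["right"] if rng.random() < node["p_right"] else node["left"]
    return node["label"]

def predict(node, shots=2000):
    """Majority vote over repeated samples, mimicking repeated circuit
    runs; only the running counts are stored, not the sampled paths."""
    counts = {}
    for _ in range(shots):
        label = sample_path(node)
        counts[label] = counts.get(label, 0) + 1
    return max(counts, key=counts.get)
```

In this toy instance, label 1 is reachable via the root's right branch (probability 0.3) and via the left subtree's right branch (0.7 × 0.8), so sampling concentrates on label 1; the on-demand character of the method is reflected in `predict` keeping only per-label counts.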

The next quantum race: Who can harness it first?


With Japan's first commercial quantum computer going into operation last month, more global competitors are entering the race to gain an advantage by mastering the next-generation technology, with Germany emerging as a strong contender. In Kawasaki, a city in the Tokyo metropolitan area, sits a commercial quantum computer made by IBM at the Kawasaki Business Incubation Center. Toyota Motor, Hitachi and Toshiba are among the companies using the device. Quantum computers are expected to break through the limitations of conventional computing. In 2019, Google startled the world by using the technology to solve, in 3 minutes and 20 seconds, a problem that would have taken a conventional computer 10,000 years.

2021 Best Insights From Quantum Computing Top Leaders


And this overhead is relatively large, so it's estimated that you need a few hundred to a thousand physical qubits to get to one logical qubit. And then this logical qubit has a significantly suppressed error rate. And then you can start to work with that, in this clean, theoretical computational paradigm where you can more or less ignore the noise from the hardware.
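The quoted overhead can be reproduced with a standard back-of-envelope surface-code model, in which the logical error rate is suppressed exponentially in the code distance d while the physical-qubit count grows roughly as 2d². The constants below (prefactor A, threshold p_th) are assumed illustrative values, not measured hardware figures.

```python
def surface_code_estimate(p_phys, p_target, p_th=1e-2, A=0.1):
    """Back-of-envelope surface-code model with assumed constants:
    logical error per round  p_L ~ A * (p_phys / p_th) ** ((d + 1) / 2).
    Returns the smallest odd code distance d that meets p_target, and
    the rough physical-qubit count ~ 2 * d**2 per logical qubit."""
    d = 3
    while A * (p_phys / p_th) ** ((d + 1) / 2) > p_target:
        d += 2
    return d, 2 * d * d

# Example: physical error rate 0.2%, target logical error 1e-10.
d, n_phys = surface_code_estimate(2e-3, 1e-10)
# With these assumed constants: d = 25, ~1250 physical qubits per
# logical qubit, i.e. the "few hundred to a thousand" regime (and
# beyond) mentioned above, depending on hardware quality.
```

The point of the exercise is the scaling: halving the physical error rate relative to threshold shrinks the required distance only logarithmically in the target error, which is why estimates cluster in the hundreds-to-thousands range.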

Eternal Change for No Energy: A Time Crystal Finally Made Real


In a preprint posted online Thursday night, researchers at Google in collaboration with physicists at Stanford, Princeton and other universities say that they have used Google's quantum computer to demonstrate a genuine "time crystal." In addition, a separate research group claimed earlier this month to have created a time crystal in a diamond. A novel phase of matter that physicists have strived to realize for many years, a time crystal is an object whose parts move in a regular, repeating cycle, sustaining this constant change without burning any energy. "The consequence is amazing: You evade the second law of thermodynamics," said Roderich Moessner, director of the Max Planck Institute for the Physics of Complex Systems in Dresden, Germany, and a co-author on the Google paper. That's the law that says disorder always increases.

IBM and CERN want to use quantum computing to unlock the mysteries of the universe


It is likely that future quantum computers will significantly boost the understanding of CERN's gigantic particle collider. The potential of quantum computers is currently being discussed in settings ranging from banks to merchant ships, and now the technology has been taken even further afield – or rather, lower down. One hundred meters below the Franco-Swiss border sits the world's largest machine, the Large Hadron Collider (LHC), operated by the European Laboratory for Particle Physics, CERN. And to better understand the mountains of data produced by such a colossal system, CERN's scientists have been asking IBM's quantum team for some assistance. The partnership has been successful: in a new paper, which is yet to be peer-reviewed, IBM's researchers have established that quantum algorithms can help make sense of the LHC's data, meaning that it is likely that future quantum computers will significantly boost scientific discoveries at CERN. With CERN's mission statement being to understand why anything in the universe happens at all, this could have big implications for anyone interested in all things matter, antimatter, dark matter and so on.

Multiple Query Optimization using a Hybrid Approach of Classical and Quantum Computing (Artificial Intelligence)

Quantum computing promises to solve difficult optimization problems in chemistry, physics and mathematics more efficiently than classical computers, but this requires fault-tolerant quantum computers with millions of qubits. To overcome the errors introduced by today's quantum computers, hybrid algorithms that combine classical and quantum computation are used. In this paper we tackle the multiple query optimization problem (MQO), an important NP-hard problem in the area of data-intensive applications. We propose a novel hybrid classical-quantum algorithm to solve the MQO on a gate-based quantum computer. We perform a detailed experimental evaluation of our algorithm and compare its performance against a competing approach that employs a quantum annealer -- another type of quantum computer. Our experimental results demonstrate that our algorithm can currently only handle small problem sizes due to the limited number of qubits available on gate-based quantum computers compared to quantum computers based on quantum annealing. However, our algorithm shows a qubit efficiency of close to 99%, almost a factor of 2 higher than the state-of-the-art implementation. Finally, we analyze how our algorithm scales with larger problem sizes and conclude that our approach shows promising results for near-term quantum computers.
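Algorithms of this kind typically encode MQO as a QUBO: one binary variable per (query, plan) pair, plan costs minus shared-work savings, plus a penalty enforcing exactly one plan per query. The sketch below uses a hypothetical two-query instance and a brute-force enumerator standing in for the quantum optimizer; the instance, costs and penalty weight are all made-up illustrations, not the paper's benchmark.

```python
from itertools import product

# Hypothetical tiny MQO instance: 2 queries, 2 candidate plans each.
cost = {(0, 0): 10, (0, 1): 12, (1, 0): 9, (1, 1): 7}   # plan costs
# savings[((q1, p1), (q2, p2))]: cost saved if both plans share work.
savings = {((0, 1), (1, 1)): 6}

PENALTY = 100  # large weight enforcing "exactly one plan per query"

def qubo_energy(bits):
    """bits[(q, p)] = 1 if plan p is chosen for query q.
    Energy = total plan cost - shared-work savings + constraint penalty."""
    e = sum(cost[k] * bits[k] for k in cost)
    e -= sum(s * bits[a] * bits[b] for (a, b), s in savings.items())
    for q in (0, 1):
        chosen = sum(bits[(q, p)] for p in (0, 1))
        e += PENALTY * (chosen - 1) ** 2
    return e

def brute_force():
    """Enumerate all bit strings -- a classical stand-in for the
    quantum optimizer minimizing the same QUBO energy."""
    keys = sorted(cost)
    best = min(product((0, 1), repeat=len(keys)),
               key=lambda v: qubo_energy(dict(zip(keys, v))))
    return dict(zip(keys, best))
```

Here the individually cheapest plans (cost 10 + 7 = 17) lose to the pair that shares work (12 + 7 - 6 = 13), which is exactly the trade-off the QUBO objective captures.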

Quantum Computing With Qiskit Ultimate Masterclass


Then you have arrived at the right place; this course is designed for you! Quantum Computing sits at the intersection of computer science, mathematics and quantum physics, and utilizes the phenomena of quantum mechanics to perform computations that classical computers cannot perform efficiently. Quantum computers can provide significant speedups over classical computers for certain kinds of algorithms, such as searching data elements or breaking RSA encryption systems! The Quantum Computing industry is expected to grow rapidly, from around USD 500 million in 2021 to nearly USD 1,800 million (1.8 billion!) by 2026. Various industries such as banking, finance, space technology, defense, healthcare, pharmaceuticals, chemicals, energy, power, transportation, logistics, academia and government stand to benefit from this cutting-edge technology.
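The building block behind the phenomena such courses teach is superposition plus entanglement, e.g. a Hadamard followed by a CNOT producing a Bell state. The sketch below simulates that two-qubit circuit directly with a statevector in plain Python (no Qiskit required), so the matrices and basis ordering here are an illustrative convention, not a library API.

```python
import math

# Two-qubit statevector, basis order |00>, |01>, |10>, |11>
# (qubit 0 is the left / most-significant bit in this convention).
state = [1.0, 0.0, 0.0, 0.0]  # start in |00>

def apply(gate, vec):
    """Multiply a 4x4 gate matrix into the statevector."""
    return [sum(gate[i][j] * vec[j] for j in range(4)) for i in range(4)]

h = 1 / math.sqrt(2)
# Hadamard on qubit 0, i.e. the matrix H (x) I.
H0 = [[h, 0, h, 0], [0, h, 0, h], [h, 0, -h, 0], [0, h, 0, -h]]
# CNOT with qubit 0 as control and qubit 1 as target.
CNOT = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]]

bell = apply(CNOT, apply(H0, state))
probs = [abs(a) ** 2 for a in bell]
# Only |00> and |11> appear, each with probability ~0.5: the two
# qubits' measurement outcomes are perfectly correlated (entangled).
```

A framework like Qiskit expresses the same circuit as gate calls and runs it on simulators or real hardware; the underlying linear algebra is exactly what this snippet does by hand.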

Microsoft Surface Laptop 4 review: Windows 10 as it is meant to be

The Guardian

Microsoft's sleek and stylish Surface Laptop is back for its fourth generation with faster performance and a greater variety of chips. The Surface Laptop 4 is available with either a 13.5in or a 15in screen and starts at £999 in the UK, $999 in the US or $1,599 in Australia, sitting above the Surface Laptop Go as Microsoft's mainstream premium notebook and competing with the similarly priced Dell XPS 13 and Apple MacBook Air, among others. Very little has changed on the outside, matching the dimensions, weight, port selection and design of 2020's Surface Laptop 3. Tested here with a 13.5in screen, it still looks and feels sleek with its aluminium lid, a choice of Alcantara fabric or aluminium deck, and a bright, crisp touchscreen. The keyboard is excellent, while the large trackpad is smooth and precise. The speakers are loud and clear with reasonable bass for a laptop, and the 720p webcam and microphones are better than most for video calls.

Vector Symbolic Architectures as a Computing Framework for Nanoscale Hardware (Artificial Intelligence)

This article reviews recent progress in the development of the computing framework Vector Symbolic Architectures (also known as Hyperdimensional Computing). This framework is well suited for implementation in stochastic, nanoscale hardware, and it naturally expresses the types of cognitive operations required for Artificial Intelligence (AI). We demonstrate in this article that the ring-like algebraic structure of Vector Symbolic Architectures offers simple but powerful operations on high-dimensional vectors that can support all data structures and manipulations relevant in modern computing. In addition, we illustrate the distinguishing feature of Vector Symbolic Architectures, "computing in superposition," which sets them apart from conventional computing. This latter property opens the door to efficient solutions to the difficult combinatorial search problems inherent in AI applications. Vector Symbolic Architectures are Turing complete, as we show, and we see them acting as a framework for computing with distributed representations in myriad AI settings. This paper serves as a reference for computer architects, illustrating the techniques and philosophy of VSAs for distributed computing and their relevance to emerging computing hardware, such as neuromorphic computing.
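The algebra the review describes reduces to two operations on high-dimensional random vectors: binding (elementwise multiply, self-inverse) and bundling (majority-vote superposition). The sketch below is a generic bipolar-VSA illustration, not code from the article; the record encoding and the tie-break rule in `bundle` are simplifying assumptions.

```python
import random

DIM = 10_000  # high dimension makes independent random vectors near-orthogonal

def rand_hv(rng):
    """Random bipolar hypervector with entries in {-1, +1}."""
    return [rng.choice((-1, 1)) for _ in range(DIM)]

def bind(a, b):
    """Binding: elementwise multiply (self-inverse, like XOR for bits)."""
    return [x * y for x, y in zip(a, b)]

def bundle(*vs):
    """Bundling: elementwise majority vote (computing in superposition);
    ties are broken toward -1 for simplicity."""
    return [1 if sum(c) > 0 else -1 for c in zip(*vs)]

def sim(a, b):
    """Normalized dot product: ~0 for unrelated vectors, 1 for identical."""
    return sum(x * y for x, y in zip(a, b)) / DIM

rng = random.Random(0)
name, alice, age, forty = (rand_hv(rng) for _ in range(4))

# Encode the record {name: alice, age: forty} as ONE vector.
record = bundle(bind(name, alice), bind(age, forty))

# Query: unbind with the 'name' key; because binding is self-inverse,
# the result is similar to 'alice' and noise-like w.r.t. everything else.
probe = bind(record, name)
```

The key property on display is that a single fixed-width vector holds the whole key-value record in superposition, yet each value can be retrieved by binding with its key and comparing similarities, which is what makes the framework attractive for stochastic nanoscale substrates.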