IBM finally proves that quantum systems are faster than classical ones

Engadget

In 1994, MIT professor of applied mathematics Peter Shor developed a groundbreaking quantum computing algorithm capable of factoring numbers (that is, finding the prime factors of any integer N) using quantum computer technology. For decades afterward, this algorithm provided a tantalizing glimpse at the potential prowess of quantum computing versus classical systems. However, researchers could never definitively prove that quantum would always be faster in this application, or whether classical systems could overtake quantum if given a sufficiently robust algorithm of their own. In a paper published Thursday in the journal Science, Dr. Sergey Bravyi and his team reveal that they have developed a mathematical proof which, in specific cases, demonstrates the quantum algorithm's inherent computational advantage over classical approaches. "It's good to know, because results like this become parts of algorithms," Bob Sutor, vice president of IBM Q Strategy and Ecosystem, told Engadget.
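The article contains no code, but the classical skeleton of Shor's reduction is easy to sketch. The Python below (the names and structure are our own illustration, not taken from the paper) factors N by finding the multiplicative order of a random base, with brute-force search standing in for the quantum period-finding step that gives the algorithm its speedup:

```python
import math
import random

def classical_order(a, n):
    """Brute-force the multiplicative order r of a mod n, i.e. the
    smallest r > 0 with a**r % n == 1. This is the step a quantum
    computer performs exponentially faster via period finding."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n, attempts=20):
    """Find a nontrivial factor of composite n via Shor's reduction
    from factoring to order finding."""
    for _ in range(attempts):
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:
            return g                 # lucky draw: a already shares a factor with n
        r = classical_order(a, n)
        if r % 2:                    # the reduction needs an even order
            continue
        y = pow(a, r // 2, n)
        if y == n - 1:               # a**(r/2) == -1 (mod n) only yields trivial factors
            continue
        g = math.gcd(y - 1, n)
        if 1 < g < n:
            return g
    return None

print(shor_factor(15))   # prints 3 or 5
```

The quantum advantage lives entirely in classical_order: on a classical machine it can take exponentially many multiplications, while Shor's quantum period-finding subroutine does the equivalent work in polynomial time.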


IBM's quantum processor comes out of hiding

PCWorld

A quantum computer for the people isn't just a theoretical dream; IBM is trying to make it a reality. IBM has built a quantum processor with five qubits, or quantum bits. Even better, IBM isn't hiding the quantum processor in its labs: it will be accessible through the cloud for the public to run experiments and test applications. The goal is to unwrap decades-old mysteries around quantum computers and let people play with the hardware, said Jay Gambetta, manager of quantum computing theory and information at IBM. IBM's quantum processor is significant because it will be the first quantum hardware accessible to the public, even if only through the cloud.
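The article doesn't show what such a cloud experiment looks like. As a modern illustration, here is a minimal sketch using IBM's open-source Qiskit library (which postdates this story), with a local simulator standing in for the cloud-hosted five-qubit chip; account setup for real hardware is omitted:

```python
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator  # local simulator standing in for a cloud backend

# Prepare a two-qubit Bell state, the "hello world" of quantum hardware.
qc = QuantumCircuit(2, 2)
qc.h(0)           # put qubit 0 into an equal superposition
qc.cx(0, 1)       # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])

backend = AerSimulator()
job = backend.run(transpile(qc, backend), shots=1024)
print(job.result().get_counts())   # expect roughly half '00' and half '11'
```

Against real hardware, only the backend changes; the circuit submitted through the cloud is exactly the same.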


Quantum Computing Explained

#artificialintelligence

In 1982, a scientific paper titled 'Simulating Physics with Computers', written by the famous physicist Richard P. Feynman, was published.


Confirmed, finally: D-Wave quantum computer is sometimes sluggish

AITopics Original Links

D-Wave Systems, the leading manufacturer of the world's first commercially available quantum computers, is the best-funded and furthest-along player in the quantum chip race, but it hasn't yet convinced scientists that its machines achieve quantum speedup. In other words, we're not sure its product is any speedier than traditional, silicon-based machines. In fact, in certain situations, the $15 million D-Wave Two is still no faster than the computer on your desk right now. A research team at the Swiss Federal Institute of Technology in Zurich reports that there is still no definitive evidence that the D-Wave Two can perform any function faster than traditional machines. The results were published in the journal Science on Thursday, though the work of lead physicist Matthias Troyer had been widely circulated since January because the paper was available as a preprint.


Quantum Computational Intelligence

#artificialintelligence

Imagine solving mathematical problems where you could use the full physical range of computational possibilities within the laws of the universe, and be inspired by the sublime algorithmic intelligence of the human brain. This is precisely why the emerging field of quantum machine learning (QML) has received so much recent attention. In this blog post, we'd like to discuss the fundamental ideas and applied value of machine learning to computation in general, and then contextualize these ideas in a new way within the paradigm of quantum computation. Machine learning, a subfield of computer science related to computational statistics and pattern recognition, emerged in its modern incarnation in the mid-to-late 20th century as researchers attempted to build thinking machines. Whereas first-generation artificial intelligence took inspiration from the computers of the 1980s to reason about intelligence, viewing humans as deterministic, syntactical machines, contemporary artificial intelligence instead builds machines with the adaptability and variability of humans in "coping" with the ill-defined problem of being an individual with incomplete information in a complex world.