
The Ongoing Battle Between Quantum and Classical Computers

WIRED

A popular misconception is that the potential--and the limits--of quantum computing must come from hardware. In the digital age, we've gotten used to marking advances in clock speed and memory. Likewise, the 50-qubit quantum machines now coming online from the likes of Intel and IBM have inspired predictions that we are nearing "quantum supremacy"--a nebulous frontier where quantum computers begin to do things beyond the ability of classical machines.


Quantum Computational Intelligence

#artificialintelligence

Imagine solving mathematical problems where you could use the full physical range of computational possibilities within the laws of the universe, and be inspired by the sublime algorithmic intelligence of the human brain. This is precisely why the emerging field of quantum machine learning (QML) has received so much recent attention. In this blog post, we'd like to discuss the fundamental ideas and applied value of machine learning for computation in general, and then contextualize these ideas in a new way within the paradigm of quantum computation. Machine learning, a subfield of computer science related to computational statistics and pattern recognition, emerged in its modern incarnation in the mid-to-late 20th century as researchers attempted to build thinking machines. First-generation artificial intelligence took inspiration from the computers of the 1980s, reasoning about intelligence as if humans were deterministic, syntactical machines. Contemporary artificial intelligence instead builds machines with the adaptability and variability of humans in "coping" with the ill-defined problem of being an individual with incomplete information in a complex world.
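
To make that contrast concrete, here is a minimal sketch (not drawn from the post itself): a nearest-centroid classifier, one of the simplest pattern-recognition methods, learns its decision rule from labeled examples rather than from hand-written rules. The toy data, labels, and function names are illustrative assumptions.

import numpy as np

# Toy 2-D data: two labeled clusters stand in for "patterns" the machine
# must pick up from examples rather than from hand-coded rules.
rng = np.random.default_rng(0)
class_a = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(50, 2))
class_b = rng.normal(loc=[2.0, 2.0], scale=0.5, size=(50, 2))

# "Learning" here is simply estimating one centroid per class from the data.
centroids = {"a": class_a.mean(axis=0), "b": class_b.mean(axis=0)}

def predict(point):
    # Assign the label of the nearest learned centroid.
    return min(centroids, key=lambda label: np.linalg.norm(point - centroids[label]))

print(predict(np.array([0.2, -0.1])))  # -> 'a'
print(predict(np.array([1.8, 2.3])))   # -> 'b'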


Google has reached quantum supremacy – here's what it should do next

New Scientist

Quantum computing is now ready to go – or is it? Google appears to have reached an impressive milestone known as quantum supremacy, where a quantum computer is able to perform a calculation that is practically impossible for a classical one. But there are plenty of hurdles left to jump over before the technology hits the big time. For a start, the processors need to be more powerful. Unlike classical computers, which store data in bits that are either 0 or 1, quantum computers use qubits, which can exist in a superposition of both states at once.
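
As a rough sketch of what that superposition means formally, a qubit's state is alpha|0> + beta|1> with |alpha|^2 + |beta|^2 = 1, and measuring it yields 0 or 1 with those probabilities. The small NumPy snippet below samples measurement outcomes from such a state; the equal-superposition amplitudes and the helper name measure are assumptions chosen for illustration.

import numpy as np

rng = np.random.default_rng(0)

# A qubit state is a length-2 complex vector (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. Measuring it gives 0 with probability
# |alpha|^2 and 1 with probability |beta|^2.
def measure(state, shots=1000):
    probs = np.abs(state) ** 2
    return rng.choice([0, 1], size=shots, p=probs)

# Equal superposition of |0> and |1>: (|0> + |1>) / sqrt(2)
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
samples = measure(plus)
print("fraction of 1s:", samples.mean())  # roughly 0.5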


Beyond quantum supremacy: the hunt for useful quantum computers

#artificialintelligence

Just occasionally, Alán Aspuru-Guzik has a movie-star moment, when fans half his age will stop him in the street. "They say, 'Hey, we know who you are'," he laughs. "Then they tell me that they also have a quantum start-up, and would love to talk to me about it." "I don't usually have time to talk, but I'm always happy to give them some tips." That affable approach is not uncommon in the quantum-computing community, says Aspuru-Guzik, who is a computer scientist at the University of Toronto, Canada, and co-founder of quantum-computing company Zapata Computing in Cambridge, Massachusetts.


IBM finally proves that quantum systems are faster than classical ones

Engadget

In 1994, Peter Shor, an MIT professor of applied mathematics, developed a groundbreaking quantum algorithm capable of factoring numbers (that is, finding the prime factors of any integer N) on a quantum computer. For the next decade, this algorithm provided a tantalizing glimpse at the potential prowess of quantum computing versus classical systems. However, researchers could never definitively prove that quantum would always be faster in this application, or whether classical systems could overtake quantum if given a sufficiently robust algorithm of their own. In a paper published Thursday in the journal Science, Dr. Sergey Bravyi and his team reveal that they've developed a mathematical proof which, in specific cases, demonstrates a quantum algorithm's inherent computational advantage over its classical counterparts. "It's good to know, because results like this become parts of algorithms," Bob Sutor, vice president of IBM Q Strategy and Ecosystem, told Engadget.
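
As a purely classical illustration of the reduction at the heart of Shor's algorithm (a sketch, not IBM's proof or Shor's actual quantum circuit): factoring N reduces to finding the order r of a random a modulo N, after which gcd(a^(r/2) - 1, N) often yields a nontrivial factor. The brute-force order search below stands in for the step a quantum computer speeds up; the function names and the tiny N are illustrative assumptions.

from math import gcd
from random import randrange

def find_order(a, n):
    # Brute-force the multiplicative order r of a modulo n: the smallest
    # r with a**r % n == 1. This is the step a quantum computer accelerates.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_style_factor(n, attempts=20):
    # Classical sketch of Shor's reduction from factoring to order finding.
    for _ in range(attempts):
        a = randrange(2, n)
        g = gcd(a, n)
        if g > 1:
            return g                  # the random pick already shares a factor
        r = find_order(a, n)
        if r % 2 == 0:
            candidate = gcd(pow(a, r // 2) - 1, n)
            if 1 < candidate < n:
                return candidate      # nontrivial factor of n
    return None

print(shor_style_factor(15))  # prints 3 or 5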