The Algorithm That Changed Quantum Machine Learning

Communications of the ACM

It's not every day that an 18-year-old college student catches the eye of the computing world, but when Ewin Tang took aim at recommendation algorithms similar to those commonly used by the likes of Amazon and Netflix, the University of Texas at Austin mathematics and computer science undergraduate overturned an established belief: that classical computers cannot perform these types of calculations at the speed of quantum computers.

In a July 2018 paper, written as a senior honors thesis under the supervision of computer science professor Scott Aaronson, a leading researcher in quantum computing algorithms, Tang presented an algorithm showing that classical computers can indeed tackle predictive recommendations at a speed previously thought possible only with quantum computers. "I actually set out to demonstrate that quantum machine learning algorithms are faster," she explains. "But, along the way, I realized this was not the case."


Has the age of quantum computing arrived?

The Guardian

Ever since Charles Babbage's conceptual, unrealised Analytical Engine of the 1830s, computer science has been trying to race ahead of its time. Particularly over the last 75 years, there have been many astounding developments – the first electronic programmable computer, the first integrated-circuit computer, the first microprocessor. But the next anticipated step may be the most revolutionary of all. Quantum computing is the technology that many scientists, entrepreneurs and big businesses expect to provide a, well, quantum leap into the future. If you've never heard of it, there's a helpful video doing the social media rounds that has racked up a couple of million views on YouTube.



Quantum Hype and Quantum Skepticism

Communications of the ACM

The first third of the 20th century saw the collapse of many absolutes. Albert Einstein's 1905 special relativity theory eliminated the notion of absolute time, while Kurt Gödel's 1931 incompleteness theorem questioned the notion of absolute mathematical truth. Most profoundly, however, quantum mechanics raised doubts about the notion of absolute objective reality. Is Schrödinger's cat dead or alive? Nearly 100 years after quantum mechanics was introduced, scientists are still not in full agreement on what it means.


A Hybrid of Quantum Computing and Machine Learning Is Spawning New Ventures

IEEE Spectrum Robotics

Machine learning, the field of AI that allows Alexa and Siri to parse what you say and self-driving cars to drive safely down a city street, could benefit from quantum-computer-derived speedups, researchers say. And if a technology incubator program in Toronto, Canada, has its way, there may even be quantum machine learning startup companies launching within a few years.

Research in this hybrid field today concentrates either on using nascent quantum computers to speed up machine learning algorithms, or on using conventional machine learning systems to increase the power, durability, or effectiveness of quantum computing systems. An ultimate goal in the field is to do both: use smaller quantum-computer-based machine learning systems to better improve, understand, or interpret large datasets of quantum information or the results of large-scale quantum computer calculations. That last goal will, of course, have to wait until large-scale quantum information storage and full-fledged quantum computers come online.