Solving 'barren plateaus' is the key to quantum machine learning

#artificialintelligence

Many machine learning algorithms on quantum computers suffer from the dreaded "barren plateau" of unsolvability, where they run into dead ends on optimization problems. This challenge had been relatively unstudied--until now. Rigorous theoretical work has established theorems that guarantee whether a given machine learning algorithm will work as it scales up on larger computers. "The work solves a key problem of usability for quantum machine learning. We rigorously proved the conditions under which certain architectures of variational quantum algorithms will or will not have barren plateaus as they are scaled up," said Marco Cerezo, lead author on the paper published in Nature Communications today by a Los Alamos National Laboratory team.
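In a barren plateau, the cost landscape of a randomly initialized parameterized circuit becomes essentially flat, so gradient-based training receives no usable signal. The sketch below is a minimal, self-contained illustration of that effect using a plain NumPy statevector simulation; the RY-plus-CNOT circuit layout, the <Z0 Z1> cost, and the finite-difference gradient are our own illustrative choices, not the architectures analyzed in the Los Alamos paper. For this kind of randomly initialized, deep ansatz, the printed gradient variance typically shrinks rapidly as qubits are added.

```python
# Minimal sketch (our illustration, not the authors' code): estimate how the
# variance of a cost gradient behaves as a randomly initialized, layered
# circuit grows. Plain NumPy statevector simulation; no quantum hardware needed.
import numpy as np

def ry(theta):
    """Single-qubit Y-rotation gate."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def apply_1q(state, gate, target, n):
    """Apply a 2x2 gate to qubit `target` of an n-qubit statevector."""
    psi = np.moveaxis(state.reshape([2] * n), target, 0)
    psi = np.tensordot(gate, psi, axes=(1, 0))
    return np.moveaxis(psi, 0, target).reshape(-1)

def apply_cnot(state, control, target, n):
    """Flip the target qubit on the branch where the control qubit is 1."""
    psi = state.reshape([2] * n).copy()
    idx = [slice(None)] * n
    idx[control] = 1
    axis = target - 1 if target > control else target  # axis index after slicing out control
    psi[tuple(idx)] = np.flip(psi[tuple(idx)], axis=axis).copy()
    return psi.reshape(-1)

def cost(params, n, layers):
    """Expectation value <Z0 Z1> after a hardware-efficient-style ansatz."""
    state = np.zeros(2 ** n, dtype=complex)
    state[0] = 1.0
    k = 0
    for _ in range(layers):
        for q in range(n):
            state = apply_1q(state, ry(params[k]), q, n)
            k += 1
        for q in range(n - 1):
            state = apply_cnot(state, q, q + 1, n)
    p = (np.abs(state) ** 2).reshape(2, 2, -1).sum(axis=2)  # joint distribution of qubits 0 and 1
    return p[0, 0] + p[1, 1] - p[0, 1] - p[1, 0]

def grad_variance(n, layers, trials=200, eps=1e-4, seed=0):
    """Variance of dC/dtheta_0 over random initializations (central finite differences)."""
    rng = np.random.default_rng(seed)
    grads = []
    for _ in range(trials):
        params = rng.uniform(0, 2 * np.pi, size=n * layers)
        plus, minus = params.copy(), params.copy()
        plus[0] += eps
        minus[0] -= eps
        grads.append((cost(plus, n, layers) - cost(minus, n, layers)) / (2 * eps))
    return np.var(grads)

for n in (2, 4, 6, 8):
    print(f"{n} qubits: Var[dC/dtheta] ~ {grad_variance(n, layers=2 * n):.2e}")
```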


'Discovery Accelerator,' a new Cleveland Clinic-IBM partnership, will use quantum computer, artificial intelligence to speed up medical innovations

#artificialintelligence

The Cleveland Clinic and IBM have entered a 10-year partnership that will install a quantum computer -- which can handle large amounts of data at lightning speeds -- at the Clinic next year to speed up medical innovations. The Discovery Accelerator, a joint Clinic-IBM center, will feature artificial intelligence, hybrid cloud data storage and quantum computing technologies. A hybrid cloud combines on-premises infrastructure with public cloud services, allowing large amounts of data to be stored and analyzed flexibly. The partnership will allow Clinic researchers to use the advanced tech in its new Global Center for Pathogen Research and Human Health for research into genomics, population health, clinical applications, and chemical and drug discovery. The Center for Global and Emerging Pathogens Research studies emerging pathogens -- such as Zika and COVID-19 -- and seeks to develop treatments and vaccines to fight the next public health threat.


Quantum computing takes another big step towards the mainstream in this new research lab

ZDNet

The Cleveland Clinic will rely on state-of-the-art IBM technology to support its latest public health project. High-performance cloud computing, artificial intelligence, and a couple of quantum computers: IBM is going all-in with a freshly signed, decade-long partnership that will see Big Blue provide the technology infrastructure for a new research center dedicated to public health threats such as the COVID-19 pandemic. The Ohio-based Cleveland Clinic, a non-profit institution that combines clinical and hospital care with medical research and education, will use state-of-the-art IBM technology to support its latest project: a global center for pathogen research and human health. Supported by a $500 million investment, the new center will be dedicated to the study of viral pathogens, virus-induced diseases, genomics, immunology and immunotherapies. To assist researchers' work preparing for and protecting against emerging pathogens, IBM has designed a "Discovery Accelerator" – contributing the company's latest capabilities to better support data-based scientific work and fast-track the discovery of new treatments.


Solving 'barren plateaus' is the key to quantum machine learning

#artificialintelligence

"The work solves a key problem of useability for quantum machine learning. We rigorously proved the conditions under which certain architectures of variational quantum algorithms will or will not have barren plateaus as they are scaled up," said Marco Cerezo, lead author on the paper published in Nature Communications today by a Los Alamos National Laboratory team. Cerezo is a post doc researching quantum information theory at Los Alamos. "With our theorems, you can guarantee that the architecture will be scalable to quantum computers with a large number of qubits." "Usually the approach has been to run an optimization and see if it works, and that was leading to fatigue among researchers in the field," said Patrick Coles, a coauthor of the study.



FSU researchers enhance quantum machine learning algorithms

#artificialintelligence

Newswise -- A Florida State University professor's research could help quantum computing fulfill its promise as a powerful computational tool. William Oates, the Cummins Inc. Professor in Mechanical Engineering and chair of the Department of Mechanical Engineering at the FAMU-FSU College of Engineering, and postdoctoral researcher Guanglei Xu found a way to automatically infer parameters used in an important quantum Boltzmann machine algorithm for machine learning applications. Their findings were published in Scientific Reports. The work could help build artificial neural networks that could be used for training computers to solve complicated, interconnected problems like image recognition, drug discovery and the creation of new materials. "There's a belief that quantum computing, as it comes online and grows in computational power, can provide you with some new tools, but figuring out how to program it and how to apply it in certain applications is a big question," Oates said.
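A Boltzmann machine (quantum or classical) is built on a Boltzmann, or Gibbs, distribution over the configurations of an energy model, and the quantities being inferred are the weights, biases and training hyperparameters of that model. As a purely classical, minimal sketch of the kind of distribution involved (our illustration with arbitrary toy values, not the FSU algorithm), the snippet below enumerates a three-unit Ising-style energy function and computes its exact Boltzmann probabilities.

```python
# Classical toy sketch (not the FSU / Scientific Reports algorithm): the Boltzmann
# (Gibbs) distribution that a Boltzmann machine is built on. Couplings, biases and
# temperature are arbitrary illustrative values.
import itertools
import numpy as np

n_units = 3
weights = np.array([[0.0, 0.8, -0.5],
                    [0.8, 0.0, 0.3],
                    [-0.5, 0.3, 0.0]])   # symmetric couplings between units
biases = np.array([0.2, -0.1, 0.4])
temperature = 1.0

def energy(s):
    """Ising-style energy of a configuration s of +/-1 spins."""
    return -0.5 * s @ weights @ s - biases @ s

configs = np.array(list(itertools.product([-1, 1], repeat=n_units)))
energies = np.array([energy(s) for s in configs])
boltzmann = np.exp(-energies / temperature)
boltzmann /= boltzmann.sum()              # the sum is the partition function Z

for s, p in zip(configs, boltzmann):
    print(s, f"p = {p:.3f}")
print("expected spin values:", boltzmann @ configs)
```

In a quantum Boltzmann machine the energy function becomes a Hamiltonian acting on qubits, and sampling from (or approximating) this distribution is where quantum hardware is expected to help.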


FSU Researchers Report Enhanced Quantum Machine Learning Algorithms - insideHPC

#artificialintelligence

Florida State University researchers report they have found a way to automatically infer parameters used in an important quantum Boltzmann machine algorithm for machine learning applications. The work could help build artificial neural networks used for training computers to solve complicated, interconnected problems, such as image recognition, drug discovery and the creation of new materials. The findings of Professor William Oates, the Cummins Inc. Professor in Mechanical Engineering and chair of the Department of Mechanical Engineering at the FAMU-FSU College of Engineering, and postdoctoral researcher Guanglei Xu were published in Scientific Reports. "There's a belief that quantum computing, as it comes online and grows in computational power, can provide you with some new tools, but figuring out how to program it and how to apply it in certain applications is a big question," said Oates. Quantum bits, unlike binary bits in a standard computer, can exist in more than one state at a time, a concept known as superposition.
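To make that last sentence concrete (a generic textbook illustration, not code from the article): a single qubit's state can be written as a normalized two-component vector of complex amplitudes, and measuring the qubit yields 0 or 1 with probabilities given by the squared magnitudes of those amplitudes.

```python
# Generic illustration of superposition (not tied to the FSU work): a qubit state
# is a normalized 2-vector of complex amplitudes; measurement probabilities are
# the squared magnitudes of those amplitudes.
import numpy as np

amplitudes = np.array([1.0, 1.0j]) / np.sqrt(2)   # equal superposition of |0> and |1>
probs = np.abs(amplitudes) ** 2                   # -> [0.5, 0.5]
print("P(0), P(1):", probs)

# Simulate repeated measurements: each shot gives 0 or 1 at random with those probabilities.
rng = np.random.default_rng(1)
shots = rng.choice([0, 1], size=1000, p=probs)
print("measured 0:", int(np.sum(shots == 0)), "of 1000 shots")
```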


How and when quantum computers will improve machine learning?

#artificialintelligence

There is strong hope (and hype) that quantum computers will help machine learning in many ways. Research in Quantum Machine Learning (QML) is a very active domain, and many small, noisy quantum computers are now available. Different approaches exist for both the long term and the short term, and it is worth asking what their respective promises and limitations are, both in theory and in practice. It all started in 2009 with the publication of the "HHL" algorithm [1], which proved an exponential speedup for solving linear systems (matrix inversion) and triggered exciting applications across linear algebra-based science, including machine learning. Since then, many algorithms have been proposed to speed up tasks such as classification [2], dimensionality reduction [3], clustering [4], recommendation systems [5], neural networks [6], kernel methods [7], SVMs [8], reinforcement learning [9], and, more generally, optimization [10].
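To make the HHL claim concrete: the underlying task is solving a linear system A x = b. The sketch below (ours, not from the article or from [1]) shows the classical baseline together with the kind of read-out an HHL-style algorithm actually provides: under assumptions such as a sparse, well-conditioned A and efficient state preparation, the quantum algorithm prepares a state proportional to x, from which quantities like <x|M|x> can be estimated, rather than returning the vector x itself.

```python
# Classical baseline for the problem HHL-type algorithms target (our sketch, not
# the quantum algorithm itself): solve A x = b, then read out an expectation value,
# which is the kind of quantity an HHL-style routine can estimate efficiently.
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])            # Hermitian, well-conditioned toy system
b = np.array([1.0, 0.0])
M = np.array([[1.0, 0.0],
              [0.0, -1.0]])           # observable to read out (Pauli-Z here)

x = np.linalg.solve(A, b)             # a classical solver returns x outright
x_unit = x / np.linalg.norm(x)        # the quantum output encodes only this direction
print("x =", x)
print("<x|M|x> =", x_unit @ M @ x_unit)
```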


Solving 'barren plateaus' is the key to quantum machine learning

#artificialintelligence

[Image caption: A barren plateau is a trainability problem that occurs in machine learning optimization algorithms when the problem-solving space turns flat as the algorithm is run.]

LOS ALAMOS, N.M., March 19, 2021--Many machine learning algorithms on quantum computers suffer from the dreaded "barren plateau" of unsolvability, where they run into dead ends on optimization problems. This challenge had been relatively unstudied--until now. Rigorous theoretical work has established theorems that guarantee whether a given machine learning algorithm will work as it scales up on larger computers. "The work solves a key problem of usability for quantum machine learning. We rigorously proved the conditions under which certain architectures of variational quantum algorithms will or will not have barren plateaus as they are scaled up," said Marco Cerezo, lead author on the paper published in Nature Communications today by a Los Alamos National Laboratory team.