Weekly Machine Learning Research Paper Reading List -- #9

#artificialintelligence

This paper investigates data-dependent kernels that are derived directly from data. This has been an outstanding issue for about two decades, and it has hampered the development of kernel-based methods. We introduce Isolation Kernel, which is solely dependent on the data distribution and requires neither class information nor explicit learning to produce a classifier. In contrast, existing data-dependent kernels rely heavily on class information and explicit learning to produce a classifier. We show that Isolation Kernel closely approximates a data-independent kernel function, the Laplacian kernel, under a uniform density distribution. With this revelation, Isolation Kernel can be viewed as a data-dependent kernel that adapts a data-independent kernel to the structure of a dataset.
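As a rough illustration of the idea, the sketch below builds an Isolation-Kernel-style similarity from random Voronoi partitions of data subsamples; the parameter names (psi, t) and the nearest-neighbour partitioning scheme are illustrative assumptions, not necessarily the paper's exact construction.

```python
# Sketch of an Isolation-Kernel-style similarity: K(x, y) is the fraction of
# random partitionings in which x and y fall into the same Voronoi cell.
# (psi, t and the nearest-centre partitioning are illustrative choices.)
import numpy as np

def isolation_kernel(X, Y, data, psi=16, t=200, rng=None):
    rng = np.random.default_rng(rng)
    K = np.zeros((len(X), len(Y)))
    for _ in range(t):
        # Sample psi reference points; they induce one Voronoi partitioning.
        centers = data[rng.choice(len(data), size=psi, replace=False)]
        # Assign each point to its nearest sampled centre (its cell).
        cx = np.argmin(((X[:, None, :] - centers) ** 2).sum(-1), axis=1)
        cy = np.argmin(((Y[:, None, :] - centers) ** 2).sum(-1), axis=1)
        K += (cx[:, None] == cy[None, :])
    return K / t

# Example: similarity is measured relative to a reference dataset.
data = np.random.default_rng(0).normal(size=(500, 2))
K = isolation_kernel(data[:5], data[:5], data)
```

Because the cells are built from the data itself, two points in a sparse region are more likely to share a cell than two equally distant points in a dense region, which is the sense in which such a kernel adapts to the dataset.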


Scientists introduce new method for machine learning classifications in quantum computing

#artificialintelligence

Quantum information scientists have introduced a new method for machine-learning classification in quantum computing. The non-linear quantum kernels in a quantum binary classifier provide new insights for improving the accuracy of quantum machine learning, which is deemed able to outperform current AI technology. The research team, led by Professor June-Koo Kevin Rhee from the School of Electrical Engineering, proposed a quantum classifier based on quantum state fidelity, using a different initial state and replacing the Hadamard classification with a swap test. Unlike the conventional approach, this method is expected to significantly enhance classification tasks when the training dataset is small, by exploiting the quantum advantage in finding non-linear features in a large feature space. Quantum machine learning holds promise as one of the key applications of quantum computing.
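To make the fidelity-based idea concrete, here is a small classical sketch: features are angle-encoded into single-qubit states (an illustrative encoding choice, not the team's circuit) and a test point is labelled by which class's encoded training states it overlaps with more. On hardware, a swap test estimates the same overlap |<psi|phi>|^2 from the probability of measuring the ancilla in |0>.

```python
# Classical sketch of fidelity-based binary classification.
# The angle encoding and class-mean comparison are illustrative assumptions.
import numpy as np

def encode(x):
    """Angle-encode a scalar feature into a single-qubit state vector."""
    return np.array([np.cos(x / 2), np.sin(x / 2)], dtype=complex)

def fidelity(psi, phi):
    """|<psi|phi>|^2 -- the quantity a swap test estimates via
    P(ancilla = 0) = 1/2 + 1/2 * |<psi|phi>|^2."""
    return abs(np.vdot(psi, phi)) ** 2

def classify(x, class0_samples, class1_samples):
    """Label x by which class's encoded training states it overlaps with more."""
    psi = encode(x)
    f0 = np.mean([fidelity(psi, encode(s)) for s in class0_samples])
    f1 = np.mean([fidelity(psi, encode(s)) for s in class1_samples])
    return 0 if f0 >= f1 else 1

print(classify(0.2, class0_samples=[0.1, 0.3], class1_samples=[2.8, 3.0]))
```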


Visualizing parameterized quantum classifiers

#artificialintelligence

The goal of this post is to explain the parameterized binary classifier of a quantum state in the case of a single qubit. This simple case allows us to have some nice visualizations and even some analytical expressions. The first part will state the results and display the figures, while the technical proofs are left to the second part. So if you don't like equations, you can stop reading after the first part. I encourage everyone not familiar with spherical coordinates to have a look here, since they are at the core of this post.
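For readers who prefer the spherical-coordinate picture in code rather than in figures, the short sketch below (function names are illustrative) writes a pure one-qubit state as |psi> = cos(theta/2)|0> + e^{i phi} sin(theta/2)|1> and maps it to the Bloch-sphere point (sin theta cos phi, sin theta sin phi, cos theta).

```python
# Spherical-coordinate (Bloch sphere) picture of a pure 1-qubit state.
import numpy as np

def qubit_state(theta, phi):
    """State vector for Bloch angles theta in [0, pi], phi in [0, 2*pi)."""
    return np.array([np.cos(theta / 2),
                     np.exp(1j * phi) * np.sin(theta / 2)])

def bloch_vector(state):
    """Cartesian Bloch-sphere coordinates (x, y, z) of a pure 1-qubit state."""
    a, b = state
    return np.array([2 * (a.conjugate() * b).real,   # sin(theta) cos(phi)
                     2 * (a.conjugate() * b).imag,   # sin(theta) sin(phi)
                     abs(a) ** 2 - abs(b) ** 2])     # cos(theta)

# |+> = (|0> + |1>)/sqrt(2) sits on the equator at (1, 0, 0).
print(bloch_vector(qubit_state(np.pi / 2, 0.0)))
```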


Quantum machine learning models are kernel methods

arXiv.org Machine Learning

With near-term quantum devices available and the race for fault-tolerant quantum computers in full swing, researchers became interested in the question of what happens if we replace a machine learning model with a quantum circuit. While such "quantum models" are sometimes called "quantum neural networks", it has been repeatedly noted that their mathematical structure is actually much more closely related to kernel methods: they analyse data in high-dimensional Hilbert spaces to which we only have access through inner products revealed by measurements. This technical manuscript summarises, formalises and extends the link by systematically rephrasing quantum models as a kernel method. It shows that most near-term and fault-tolerant quantum models can be replaced by a general support vector machine whose kernel computes distances between data-encoding quantum states. In particular, kernel-based training is guaranteed to find better or equally good quantum models than variational circuit training. Overall, the kernel perspective of quantum machine learning tells us that the way that data is encoded into quantum states is the main ingredient that can potentially set quantum models apart from classical machine learning models.
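A minimal sketch of this kernel perspective, under the assumption of a simple, classically simulable angle encoding: compute the Gram matrix of overlaps between data-encoding states and hand it to an ordinary support vector machine as a precomputed kernel.

```python
# Sketch of a "quantum model as kernel method": the kernel is the overlap
# between data-encoding states. The product angle encoding is an illustrative
# assumption; a hardware circuit would only change how K is estimated.
import numpy as np
from sklearn.svm import SVC

def encode(x):
    """Product angle-encoding of a feature vector into a multi-qubit state."""
    state = np.array([1.0 + 0j])
    for xi in x:
        state = np.kron(state, [np.cos(xi / 2), np.sin(xi / 2) + 0j])
    return state

def quantum_kernel(A, B):
    """K[i, j] = |<phi(a_i)|phi(b_j)>|^2, the overlap of encoding states."""
    SA = np.array([encode(a) for a in A])
    SB = np.array([encode(b) for b in B])
    return np.abs(SA.conj() @ SB.T) ** 2

rng = np.random.default_rng(0)
X = rng.uniform(0, np.pi, size=(40, 2))
y = (X[:, 0] > X[:, 1]).astype(int)

clf = SVC(kernel="precomputed").fit(quantum_kernel(X, X), y)
print(clf.predict(quantum_kernel(X[:5], X)))
```

Swapping in an encoding implemented by a quantum circuit only changes how the Gram matrix entries are estimated; the classifier trained on top remains a standard support vector machine.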


IBM shows quantum computers can solve these problems that classical computers find hard

ZDNet

The most standard example of a classification problem is when a computer is given pictures of dogs and cats and is required to label every future image it sees as either a dog or a cat. Among the most promising applications of quantum computing, quantum machine learning is expected to make waves, but how exactly remains something of a mystery. In what could shed light on how realistic those expectations are, IBM's researchers are now claiming that they have mathematically proven that, by using a quantum approach, certain machine-learning problems can be solved exponentially faster than they would be with classical computers. Machine learning is a well-established branch of artificial intelligence that is already used in many industries to solve a variety of business problems. The approach consists of training an algorithm on large datasets to enable the model to identify different patterns and eventually calculate the best answer when presented with new information.