Goto

Collaborating Authors

 Wiklicky, Herbert


Tunable Quantum Neural Networks in the QPAC-Learning Framework

arXiv.org Artificial Intelligence

In this paper, we investigate the performance of tunable quantum neural networks in the Quantum Probably Approximately Correct (QPAC) learning framework. Tunable neural networks are quantum circuits made of multi-controlled X gates. By tuning the set of controls, these circuits can approximate any Boolean function. This architecture is particularly well suited to the QPAC-learning framework, as it can handle the superposition produced by the oracle. To tune the network so that it approximates a target concept, we have devised and implemented an algorithm based on amplitude amplification. The numerical results show that this approach can efficiently learn concepts from a simple class.
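To illustrate the architecture, here is a minimal classical sketch under a simplified reading of the abstract: each multi-controlled X gate is identified by the set of input qubits it is controlled on, and on a basis-state input the target qubit ends up holding the parity of the gates whose controls are all satisfied. The greedy, error-driven tuning loop below is a hypothetical classical stand-in for the paper's amplitude-amplification procedure, not the algorithm itself.

    from itertools import combinations, product

    def network_output(gates, x):
        # Each gate is a frozenset of control positions; it flips the target
        # qubit when all of its controls read 1 in x, so on a basis-state
        # input the output is the parity of the gates that fire.
        return sum(all(x[i] for i in g) for g in gates) % 2

    def error_rate(gates, target, inputs):
        # Fraction of inputs on which the network disagrees with the target
        # concept (uniform distribution over all basis states).
        return sum(network_output(gates, x) != target(x) for x in inputs) / len(inputs)

    def tune(target, n, eps=0.0, max_rounds=100):
        # Hypothetical classical stand-in for the tuning procedure: greedily
        # toggle the candidate gate (set of controls) that most reduces the
        # error; the paper instead estimates and suppresses the error using
        # amplitude amplification on the QPAC oracle's superposition.
        inputs = list(product([0, 1], repeat=n))
        candidates = [frozenset(c) for r in range(n + 1)
                      for c in combinations(range(n), r)]
        gates = set()
        for _ in range(max_rounds):
            if error_rate(gates, target, inputs) <= eps:
                break
            best = min(candidates, key=lambda g: error_rate(gates ^ {g}, target, inputs))
            if error_rate(gates ^ {best}, target, inputs) >= error_rate(gates, target, inputs):
                break
            gates ^= {best}
        return gates

    # Example: learn the 3-bit concept x0 AND x1; the expected tuned network
    # is the single gate controlled on qubits 0 and 1.
    print(sorted(map(sorted, tune(lambda x: x[0] & x[1], n=3))))  # [[0, 1]]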


On the Non-Existence of a Universal Learning Algorithm for Recurrent Neural Networks

Neural Information Processing Systems

We prove that the so-called "loading problem" for (recurrent) neural networks is unsolvable. This extends several results which already demonstrated that training and related design problems for neural networks are (at least) NP-complete. Our result also implies that it is impossible to find or to formulate a universal training algorithm, which for any neural network architecture could determine a correct set of weights. For the simple proof of this, we will just show that the loading problem is equivalent to "Hilbert's tenth problem", which is known to be unsolvable.
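As a hedged illustration of what the abstract's "loading problem" refers to (and not the paper's actual reduction), the sketch below enumerates integer weight assignments for a fixed toy recurrent unit and halts as soon as one reproduces the given sample. When no fitting weights exist, the search runs forever, which is the behaviour one expects from a problem that is semi-decidable but, by the equivalence with Hilbert's tenth problem, not decidable.

    from itertools import product

    def run(weights, seq):
        # Toy recurrent unit with integer weights (w_in, w_rec, bias) and a
        # hard-threshold activation; returns the final state after reading seq.
        w_in, w_rec, bias = weights
        state = 0
        for x in seq:
            state = 1 if w_in * x + w_rec * state + bias > 0 else 0
        return state

    def fits(weights, sample):
        # Does this weight assignment reproduce every (sequence, label) pair?
        return all(run(weights, seq) == label for seq, label in sample)

    def load(sample):
        # Semi-decision procedure for the (toy) loading problem: enumerate
        # integer weight triples by increasing magnitude and return the first
        # one that fits.  If the sample cannot be loaded by this architecture,
        # the loop never terminates.
        bound = 0
        while True:
            bound += 1
            for weights in product(range(-bound, bound + 1), repeat=3):
                if fits(weights, sample):
                    return weights

    # A loadable sample: the output should equal the last input bit.
    sample = [((0, 1), 1), ((1, 0), 0), ((1, 1), 1), ((0, 0), 0)]
    print(load(sample))  # first fitting triple found, here (1, -1, 1)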