Re-uploading quantum data: A universal function approximator for quantum inputs

Hyunho Cha, Daniel K. Park, Jungwoo Lee

arXiv.org Artificial Intelligence 

Quantum machine learning (QML) seeks to harness quantum computation to enhance machine learning tasks [1, 2, 3, 4, 5]. Quantum computers can perform certain linear-algebra subroutines faster than classical machines under state-preparation assumptions [6, 7, 8]. Motivated by such potential quantum speedups, a variety of QML models have been explored, from quantum kernel methods to variational quantum circuits, all aiming to outperform their classical counterparts [9, 10, 11, 12, 13, 14, 15, 16]. A key component of any QML model is how data are encoded into and processed by quantum circuits [17, 18, 19, 20, 21, 22, 23, 24, 25, 26]. For classical input data, one common approach is to embed the data into a quantum state through parameterized gate operations. Recent work has shown that repeatedly encoding the data within a circuit, a technique known as data re-uploading, enhances a model's expressive power; in particular, even a single qubit can serve as a universal quantum classifier [27, 28, 18, 29].
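To make the data re-uploading idea concrete, the following is a minimal sketch of a single-qubit classifier in which each layer re-encodes the classical input x alongside trainable angles. The layer structure (Rz·Ry per layer), the layer count, and the parameter values are illustrative assumptions, not the exact model from the works cited above; a state-vector simulation with NumPy stands in for a real quantum device.

```python
import numpy as np

def ry(theta):
    """Single-qubit rotation about the y-axis."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def rz(theta):
    """Single-qubit rotation about the z-axis."""
    return np.array([[np.exp(-1j * theta / 2), 0],
                     [0, np.exp(1j * theta / 2)]], dtype=complex)

def reupload_circuit(x, params):
    """Apply L layers to |0>; each layer re-uploads the input x:
    U_l = Rz(b_l) Ry(a_l + w_l * x), with trainable (a_l, b_l, w_l)."""
    state = np.array([1.0, 0.0], dtype=complex)  # start in |0>
    for a, b, w in params:
        state = rz(b) @ ry(a + w * x) @ state
    return state

def predict(x, params):
    """Class score: probability of measuring the qubit in |0>."""
    state = reupload_circuit(x, params)
    return float(np.abs(state[0]) ** 2)

# Three re-uploading layers with hypothetical (untrained) angles.
params = [(0.3, 0.1, 1.0), (-0.2, 0.4, 0.5), (0.7, -0.3, 2.0)]
print(predict(0.5, params))  # a probability in [0, 1]
```

Because x enters the circuit once per layer, the output probability is a trigonometric series in x whose harmonic content grows with the number of layers, which is the mechanism behind the expressivity and universality results cited above.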