Quantum Long Short-term Memory with Differentiable Architecture Search

Samuel Yen-Chi Chen, Prayag Tiwari

arXiv.org Artificial Intelligence 

To address the challenge of manually designing variational quantum circuit architectures, we propose a differentiable quantum architecture search framework (DiffQAS) integrated into the quantum long short-term memory (QLSTM) model. This approach enables end-to-end training of both the conventional circuit parameters and the architectural control parameters that determine the contribution of each candidate variational quantum circuit. Through comprehensive numerical experiments, we show that the resulting DiffQAS-QLSTM framework outperforms baseline QLSTM models with manually designed circuits on benchmark time-series prediction tasks. We envision that this framework will facilitate the adoption of QML models, especially quantum sequence learners, by a broader range of domain experts, bridging the gap between quantum algorithm design and practical applications.
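The core mechanism described above can be illustrated with a minimal sketch. Here, each candidate variational quantum circuit is stood in for by a simple parameterized function, and softmax-normalized architectural logits weight their outputs so that both the circuit parameters and the architectural parameters are differentiable and trainable together. All names (`candidate_circuits`, `theta`, `alpha`) are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

def softmax(a):
    """Normalize architectural logits into mixture weights."""
    e = np.exp(a - a.max())
    return e / e.sum()

# Stand-ins for candidate variational quantum circuits acting on input x
# with trainable parameters theta. Real VQCs would return measurement
# expectation values; these toy functions only mimic that interface.
candidate_circuits = [
    lambda x, theta: np.sin(theta[0] * x),
    lambda x, theta: np.cos(theta[1] * x),
    lambda x, theta: np.tanh(theta[2] * x),
]

def mixed_output(x, theta, alpha):
    """Weighted sum of candidate outputs.

    theta: conventional circuit parameters.
    alpha: architectural control logits; softmax(alpha) gives each
    candidate's contribution, as in DARTS-style differentiable search.
    """
    w = softmax(alpha)
    return sum(w[i] * c(x, theta) for i, c in enumerate(candidate_circuits))

theta = np.array([0.5, 1.0, 1.5])
alpha = np.zeros(3)            # uniform mixture before any training
y = mixed_output(0.3, theta, alpha)
```

Because `mixed_output` is differentiable in both `theta` and `alpha`, gradient descent can simultaneously tune the circuits and shift the mixture toward the best-performing candidate architecture.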