Do GNN-based QEC Decoders Require Classical Knowledge? Evaluating the Efficacy of Knowledge Distillation from MWPM
arXiv.org Artificial Intelligence
Quantum computers hold the potential to outperform classical computers on specific computational problems, but their realization is hindered by the fragility of qubits under decoherence. Quantum Error Correction (QEC) is an essential technology for overcoming this challenge, enabling the detection and correction of errors by redundantly encoding a single logical qubit into multiple physical qubits. The performance of QEC depends critically on the classical "decoder" algorithm, which interprets the error syndrome to deduce the appropriate correction operation. The standard decoder for the surface code, Minimum-Weight Perfect Matching (MWPM) [1], performs well under a simplified noise model in which errors are assumed to be independent and identically distributed (i.i.d.). However, noise in real quantum devices exhibits complex spatio-temporal correlations, and this discrepancy between the theoretical model and reality can degrade the decoder's performance. To address this, decoders based on machine learning, such as Graph Neural Networks (GNNs), have emerged as a promising alternative [2, 3]: GNNs can learn error patterns directly from data. It is generally anticipated that injecting physical prior knowledge into a GNN should improve its performance. Specifically, "knowledge distillation" [4], which transfers knowledge of the theoretical error structure from MWPM to a GNN, is considered a concrete method for realizing this hypothesis.
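The distillation setup the abstract describes could, under common conventions from Hinton et al. [4], take the form of a loss that mixes hard-label cross-entropy with a KL-divergence term pulling the GNN student toward the MWPM teacher's (temperature-softened) correction probabilities. The sketch below is illustrative only, with hypothetical function names and hyperparameters (`alpha`, `T`); it is not the paper's actual implementation.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_probs, labels, alpha=0.5, T=2.0):
    """Hypothetical distillation objective: alpha * CE(hard labels)
    + (1 - alpha) * T^2 * KL(teacher || student), where the teacher
    distribution would come from MWPM corrections (softened at T)."""
    n = len(labels)
    # Cross-entropy against the ground-truth correction labels.
    p_student = softmax(student_logits, T=1.0)
    ce = -np.log(p_student[np.arange(n), labels] + 1e-12).mean()
    # KL divergence toward the teacher at matching temperature;
    # the T^2 factor keeps gradient scales comparable (Hinton et al.).
    p_s_T = softmax(student_logits, T=T)
    kl = (teacher_probs
          * (np.log(teacher_probs + 1e-12) - np.log(p_s_T + 1e-12))
          ).sum(axis=-1).mean()
    return alpha * ce + (1.0 - alpha) * (T ** 2) * kl
```

A student whose logits agree with both the labels and the teacher incurs a lower loss than one that contradicts them, which is the behavior the distillation term is meant to enforce.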
Aug-7-2025