A Short Note on Upper Bounds for Graph Neural Operator Convergence Rate

Roxanne Holden, Luana Ruiz

arXiv.org Machine Learning 

ABSTRACT

Graphons, as limits of convergent graph sequences, provide a framework for analyzing the asymptotic behavior of graph neural operators. Spectral convergence of sampled graphs to graphons yields operator-level convergence rates, enabling transferability analyses of GNNs. This note summarizes known bounds under no assumptions, global Lipschitz continuity, and piecewise-Lipschitz continuity, highlighting the tradeoffs between assumptions and rates and illustrating the empirical tightness of these bounds on synthetic and real data.

Index Terms -- graph neural operator, graphon, convergence rates, graph neural networks, transferability

1. INTRODUCTION

Graph neural networks (GNNs) are widely used in drug discovery [1, 2], social networks [3, 4], recommendation systems [5], and NLP [6, 7, 8]. GNNs operate on graph-structured data via message passing and aggregation [9], but training them on large graphs is computationally expensive.
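To make the message-passing-and-aggregation description concrete, the following is a minimal sketch of a single layer; it is illustrative only, not the architecture analyzed in this note, and the function name and mean-aggregation choice are assumptions.

```python
import numpy as np

def message_passing_layer(A, X, W):
    """One illustrative message-passing step: aggregate neighbor features,
    then apply a shared linear transform and a pointwise nonlinearity.

    A : (n, n) adjacency matrix
    X : (n, d_in) node features
    W : (d_in, d_out) weight matrix (fixed here; learnable in practice)
    """
    deg = A.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0                 # guard against isolated nodes
    agg = (A @ X) / deg                 # mean aggregation over neighbors
    return np.maximum(agg @ W, 0.0)     # ReLU nonlinearity

# Toy usage on a 4-node path graph
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.default_rng(0).normal(size=(4, 3))
W = np.random.default_rng(1).normal(size=(3, 2))
print(message_passing_layer(A, X, W).shape)  # (4, 2)
```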
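The spectral-convergence statement in the abstract can also be illustrated numerically. Below is a minimal sketch, assuming a simple smooth graphon W(x, y) = exp(-|x - y|) chosen purely for illustration (it is not the model studied in the paper): graphs of growing size are sampled from W, and the leading eigenvalues of the normalized adjacency are compared with those of a fine discretization of the graphon operator, so the eigenvalue gap can be seen to shrink as n grows.

```python
import numpy as np

def graphon(x, y):
    # Illustrative smooth graphon W(x, y) = exp(-|x - y|)
    return np.exp(-np.abs(x[:, None] - y[None, :]))

def sample_graph(n, rng):
    u = np.sort(rng.uniform(size=n))          # latent node positions
    P = graphon(u, u)                          # edge probabilities
    A = (rng.uniform(size=(n, n)) < P).astype(float)
    A = np.triu(A, 1)
    A = A + A.T                                # symmetric, no self-loops
    return A / n                               # normalization matching the graphon operator

def graphon_spectrum(m=2000, k=5):
    x = (np.arange(m) + 0.5) / m
    Tw = graphon(x, x) / m                     # discretized graphon integral operator
    return np.sort(np.linalg.eigvalsh(Tw))[::-1][:k]

rng = np.random.default_rng(0)
lam_limit = graphon_spectrum()
for n in (100, 400, 1600):
    lam_n = np.sort(np.linalg.eigvalsh(sample_graph(n, rng)))[::-1][:5]
    # Maximum deviation of the top eigenvalues from the graphon spectrum
    print(n, np.max(np.abs(lam_n - lam_limit)))
```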