A Short Note on Upper Bounds for Graph Neural Operator Convergence Rate
ABSTRACT

Graphons, as limits of graph sequences, provide a framework for analyzing the asymptotic behavior of graph neural operators. Spectral convergence of sampled graphs to graphons yields operator-level convergence rates, enabling transferability analyses of GNNs. This note summarizes known bounds under no assumptions, global Lipschitz continuity, and piecewise-Lipschitz continuity, highlighting tradeoffs between assumptions and rates, and illustrating their empirical tightness on synthetic and real data.

Index Terms-- graph neural operator, graphon, convergence rates, graph neural networks, transferability

1. INTRODUCTION

Graph neural networks (GNNs) are widely used in drug discovery [1, 2], social networks [3, 4], recommendation systems [5], and NLP [6, 7, 8]. GNNs operate on graph-structured data via message passing and aggregation [9], but training on large graphs is computationally expensive.
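The message-passing-and-aggregation mechanism mentioned above can be sketched as a single GNN layer in which each node averages its neighbors' features and applies a shared linear map with a nonlinearity. This is a minimal illustrative sketch, not the operator studied in this note; all names and the normalization choice are assumptions.

```python
import numpy as np

def message_passing_layer(A, H, W):
    """One illustrative GNN layer (hypothetical helper, not from the note).

    A: (n, n) adjacency matrix, H: (n, d_in) node features,
    W: (d_in, d_out) shared weight matrix.
    """
    deg = A.sum(axis=1, keepdims=True)
    A_norm = A / np.maximum(deg, 1)          # row-normalized neighbor aggregation
    return np.maximum(A_norm @ H @ W, 0.0)   # linear transform + ReLU

# Toy example: a path graph on 3 nodes with one-hot features.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
H = np.eye(3)
W = np.ones((3, 2))
out = message_passing_layer(A, H, W)
print(out.shape)  # (3, 2)
```

The row normalization makes the aggregation an averaging operator, which is the discrete analogue of the graphon integral operator whose spectral convergence underlies the bounds summarized here.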
Oct-27-2025