Generalization Error of Graph Neural Networks in the Mean-field Regime
Aminian, Gholamali, He, Yixuan, Reinert, Gesine, Szpruch, Łukasz, Cohen, Samuel N.
arXiv.org Artificial Intelligence
This work provides a theoretical framework for assessing the generalization error of graph neural networks on graph classification tasks in the over-parameterized regime, where the number of parameters surpasses the number of data points. We study two widely used types of graph neural networks: graph convolutional neural networks and message passing graph neural networks. Prior to this study, existing bounds on the generalization error in the over-parameterized regime were uninformative, limiting our understanding of over-parameterized network performance. Our novel approach derives upper bounds within the mean-field regime for evaluating the generalization error of these graph neural networks. We establish upper bounds with a convergence rate of $O(1/n)$, where $n$ is the number of graph samples. These upper bounds offer a theoretical assurance of the networks' performance on unseen data in the challenging over-parameterized regime and contribute to our overall understanding of their performance.
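As an illustration of the message passing architecture the abstract refers to, the following is a minimal sketch of one message-passing layer followed by a graph-level readout, as used for graph classification. All function names, weight shapes, and the choice of tanh nonlinearity and mean-pooling are illustrative assumptions, not details from the paper:

```python
import numpy as np

def message_passing_layer(H, A, W_msg, W_upd):
    """One message-passing step: each node aggregates transformed
    neighbor features (messages) and combines them with its own state.
    H: (n_nodes, d) node features; A: (n_nodes, n_nodes) adjacency."""
    messages = A @ (H @ W_msg)            # sum messages from neighbors
    return np.tanh(H @ W_upd + messages)  # update each node's embedding

def readout(H):
    """Mean-pool node embeddings into one graph-level representation,
    which a classifier head would consume for graph classification."""
    return H.mean(axis=0)

# Tiny example: a 3-node path graph with 2-dimensional node features.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
H = rng.normal(size=(3, 2))
W_msg = rng.normal(size=(2, 2))
W_upd = rng.normal(size=(2, 2))

H1 = message_passing_layer(H, A, W_msg, W_upd)  # per-node embeddings
g = readout(H1)                                 # graph embedding, shape (2,)
```

In the over-parameterized setting studied by the paper, the hidden width (here, the second dimension of the weight matrices) would be taken large relative to the number of training graphs; the mean-field analysis concerns that limit.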
Feb-10-2024