Review for NeurIPS paper: Convergence and Stability of Graph Convolutional Networks on Large Random Graphs
Neural Information Processing Systems
Summary and Contributions: This paper presents a theoretical analysis of the convergence and stability properties of GCNs on large random graphs. It introduces continuous GCNs (c-GCNs), which act on a bounded, piecewise-Lipschitz function of unobserved latent node variables linked through a similarity kernel. The paper makes two main contributions. First, it studies notions of invariance and equivariance to isomorphisms of random graph models, and gives convergence results of discrete GCNs to c-GCNs in the large-graph limit. Specifically, in the invariant case the authors claim that the outputs of both networks lie in the same output space.
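For concreteness, the latent-variable random graph model described above (latent node variables linked through a similarity kernel) can be sketched as follows; this is an illustrative reconstruction by the reviewer, not the authors' code, and the kernel used is an arbitrary example:

```python
import numpy as np

def sample_latent_graph(n, kernel, rng=None):
    """Sample a random graph: latent variables x_i ~ U[0, 1],
    edge (i, j) present independently with probability kernel(x_i, x_j)."""
    rng = np.random.default_rng(rng)
    x = rng.uniform(size=n)                       # unobserved latent node variables
    p = kernel(x[:, None], x[None, :])            # pairwise connection probabilities
    upper = np.triu(rng.uniform(size=(n, n)) < p, k=1)
    adj = upper | upper.T                         # symmetric adjacency, no self-loops
    return x, adj.astype(int)

# Example with a smooth similarity kernel (chosen for illustration only)
x, A = sample_latent_graph(200, lambda u, v: 0.8 * np.exp(-3.0 * np.abs(u - v)))
```

A discrete GCN observes only the sampled adjacency matrix `A`, while the c-GCN operates directly on the kernel and the latent variables; the convergence results relate the two as `n` grows.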
Feb-8-2025, 07:20:26 GMT