Layer-Dependent Importance Sampling for Training Deep and Large Graph Convolutional Networks
Difan Zou, Ziniu Hu, Yewen Wang, Song Jiang, Yizhou Sun, Quanquan Gu
Neural Information Processing Systems
Graph convolutional networks (GCNs) have recently received wide attention, owing to their successful applications across a variety of graph tasks and domains. Training GCNs on a large graph, however, remains a challenge. The original full-batch GCN training requires computing the representations of all nodes in the graph at every GCN layer, which incurs high computation and memory costs. To alleviate this issue, several sampling-based methods have been proposed that train GCNs on a subset of nodes. Among them, the node-wise neighbor-sampling method recursively samples a fixed number of neighbors per node, so its computation cost suffers from an exponentially growing neighborhood size; the layer-wise importance-sampling method, in contrast, discards the neighbor-dependency constraints, so the nodes sampled in adjacent layers suffer from a sparse-connection problem.
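To make the contrast concrete, below is a minimal, self-contained sketch of the two sampling regimes the abstract describes. It is illustrative only and not the paper's algorithm: the graph is random toy data, the function names (`node_wise_sample`, `layer_wise_sample`) are invented for this example, and plain uniform sampling stands in for the importance distribution that layer-wise methods (in the spirit of FastGCN) actually use.

```python
import random
from collections import defaultdict

random.seed(0)

# Toy undirected graph: 1000 nodes, roughly 10 random neighbors each.
NUM_NODES = 1000
adj = defaultdict(set)
for v in range(NUM_NODES):
    for u in random.sample(range(NUM_NODES), 10):
        if u != v:
            adj[v].add(u)
            adj[u].add(v)

def node_wise_sample(batch, fanout, num_layers):
    """Node-wise neighbor sampling (GraphSAGE-style): for every node in the
    current frontier, keep `fanout` random neighbors. The frontier grows by
    roughly a factor of `fanout` per layer, i.e. the receptive field is
    O(|batch| * fanout ** num_layers)."""
    frontiers = [set(batch)]
    for _ in range(num_layers):
        nxt = set()
        for v in frontiers[-1]:
            neigh = list(adj[v])
            nxt.update(random.sample(neigh, min(fanout, len(neigh))))
        frontiers.append(nxt)
    return frontiers

def layer_wise_sample(batch, layer_size, num_layers):
    """Layer-wise sampling: draw a fixed-size node set per layer from the
    whole graph, independent of the previous layer's nodes. Memory is
    linear in depth, but a sampled node may have no sampled neighbor in
    the adjacent layer (the sparse-connection problem). Real layer-wise
    importance sampling draws nodes non-uniformly; uniform sampling is
    used here only to keep the sketch short."""
    layers = [set(batch)]
    for _ in range(num_layers):
        layers.append(set(random.sample(range(NUM_NODES), layer_size)))
    return layers

nw = node_wise_sample(batch=range(8), fanout=5, num_layers=3)
lw = layer_wise_sample(batch=range(8), layer_size=64, num_layers=3)
print("node-wise frontier sizes:", [len(f) for f in nw])  # grows ~5x per layer
print("layer-wise layer sizes:  ", [len(f) for f in lw])  # fixed per layer
```

Running the sketch shows the trade-off: the node-wise frontier expands by roughly the fanout factor at each layer, while the layer-wise sets stay at a fixed size per layer but are drawn without regard to which nodes the neighboring layer sampled.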