Towards Efficient Training of Graph Neural Networks: A Multiscale Approach
Eshed Gal, Moshe Eliasof, Carola-Bibiane Schönlieb, Eldad Haber, Eran Treister
arXiv.org Artificial Intelligence
Graph Neural Networks (GNNs) have emerged as a powerful tool for learning and inference on graph-structured data, and are widely used in a variety of applications, often involving large amounts of data and large graphs. However, training on such data requires substantial memory and extensive computation. In this paper, we introduce a novel framework for efficient multiscale training of GNNs, designed to integrate information across multiscale representations of a graph. Our approach leverages a hierarchical graph representation, taking advantage of coarse graph scales in the training process, where each coarse-scale graph has fewer nodes and edges. Based on this approach, we propose a suite of GNN training methods, including coarse-to-fine, sub-to-full, and multiscale gradient computation. We demonstrate the effectiveness of our methods on various datasets and learning tasks.
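The hierarchical representation described above rests on graph coarsening: pooling groups of fine-scale nodes into single coarse-scale nodes so that training can run on a smaller graph first. A minimal sketch of one such pooling step is shown below; the cluster-assignment pooling used here is an illustration of the general idea, not necessarily the coarsening operator used in the paper, and the function name `coarsen` is our own.

```python
import numpy as np

def coarsen(adj, clusters):
    """Pool an adjacency matrix by merging nodes within clusters.

    `clusters` maps each fine node to a coarse-node index. With the
    one-hot assignment matrix P, the coarse adjacency is P^T A P,
    which sums the edge weights between and within clusters.
    """
    n_fine = len(clusters)
    n_coarse = max(clusters) + 1
    P = np.zeros((n_fine, n_coarse))
    P[np.arange(n_fine), clusters] = 1.0   # one-hot cluster assignment
    return P.T @ adj @ P                   # coarse-scale adjacency

# Fine graph: a 4-cycle on nodes 0-1-2-3
adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]], dtype=float)

# Merge nodes {0, 1} and {2, 3} into two coarse nodes
coarse_adj = coarsen(adj, [0, 0, 1, 1])
print(coarse_adj.shape)  # (2, 2): half the nodes of the fine graph
```

In a coarse-to-fine schedule, one would train the GNN on `coarse_adj` first and then continue training on the original `adj`, amortizing early epochs on the cheaper graph.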
Mar-26-2025