On the Effectiveness of Random Weights in Graph Neural Networks
Thu Bui, Carola-Bibiane Schönlieb, Bruno Ribeiro, Beatrice Bevilacqua, Moshe Eliasof
Graph Neural Networks (GNNs) have achieved remarkable success across diverse tasks on graph-structured data, primarily through the use of learned weights in message passing layers. In this paper, we demonstrate that random weights can be surprisingly effective, achieving performance comparable to that of end-to-end trained counterparts across various tasks and datasets. Specifically, we show that by replacing learnable weights with random weights, GNNs can retain strong predictive power while reducing training time by up to 6$\times$ and memory usage by up to 3$\times$. Moreover, the random weights combined with our construction yield random graph propagation operators, which we show mitigate the problem of feature rank collapse in GNNs. These insights and empirical results highlight random weights as a lightweight and efficient alternative, offering a compelling perspective on the design and training of GNN architectures.
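The sketch below illustrates the core idea from the abstract, not the authors' actual construction: a GCN-style message-passing layer in PyTorch whose weight matrix is randomly initialized and registered as a non-trainable buffer, so propagation uses fixed random weights while any downstream readout could still be trained. All class and variable names are illustrative.

```python
# Minimal sketch (assumed, not the paper's code): message passing with
# frozen random weights instead of learned ones.
import torch
import torch.nn as nn

class RandomWeightGCNLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        # Randomly initialized weight matrix, stored as a buffer so it is
        # saved with the model but excluded from gradient updates.
        W = torch.empty(in_dim, out_dim)
        nn.init.xavier_uniform_(W)
        self.register_buffer("W", W)

    def forward(self, x: torch.Tensor, adj_norm: torch.Tensor) -> torch.Tensor:
        # Standard GCN propagation H' = act(A_hat @ H @ W), with W fixed.
        return torch.relu(adj_norm @ x @ self.W)

# Toy usage: x is an (N, in_dim) node-feature matrix, adj_norm the (N, N)
# symmetrically normalized adjacency (here a fully connected toy graph
# that already includes self-loops).
N, d_in, d_out = 5, 8, 16
x = torch.randn(N, d_in)
A = torch.ones(N, N)
deg = A.sum(dim=1)
adj_norm = A / torch.sqrt(deg[:, None] * deg[None, :])
out = RandomWeightGCNLayer(d_in, d_out)(x, adj_norm)
print(out.shape)  # torch.Size([5, 16])
```

Because `W` never receives gradients, the backward pass and optimizer state for the message-passing layers disappear, which is consistent with the training-time and memory savings the abstract reports.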
Jan-31-2025