Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning
Yanbang Wang, Department of Computer Science, Purdue University
Neural Information Processing Systems
Learning representations of sets of nodes in a graph is crucial for applications ranging from node-role discovery to link prediction and molecule classification. Graph Neural Networks (GNNs) have achieved great success in graph representation learning. However, the expressive power of GNNs is limited by the 1-Weisfeiler-Lehman (WL) test, and thus GNNs generate identical representations for graph substructures that may in fact be very different. More powerful GNNs, proposed recently by mimicking higher-order WL tests, focus only on representing entire graphs, and they are computationally inefficient because they cannot utilize the sparsity of the underlying graph. Here we propose and mathematically analyze a general class of structure-related features, termed Distance Encoding (DE).
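To make the idea concrete, the sketch below illustrates one simple instantiation of a distance-based structural feature: encoding each node by its shortest-path distances to a target node set, which can then be appended to the node's input features for a GNN. This is a minimal illustrative sketch, not the paper's implementation; the function names, the distance cap, and the use of plain BFS over an adjacency-list dict are assumptions made here for clarity.

```python
from collections import deque

def bfs_distances(adj, source):
    """Shortest-path distances from `source` via BFS over an
    adjacency-list dict {node: [neighbors]}."""
    dist = {source: 0}
    q = deque([source])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def distance_encoding(adj, target_set, max_dist=3):
    """For each node, the (capped) shortest-path distances to every
    node in `target_set`; usable as extra structural input features.
    Unreachable nodes and distances beyond `max_dist` are clipped to
    max_dist + 1."""
    enc = {v: [] for v in adj}
    for s in target_set:
        d = bfs_distances(adj, s)
        for v in adj:
            enc[v].append(min(d.get(v, max_dist + 1), max_dist + 1))
    return enc

# Example: a 4-cycle 0-1-2-3-0. With uniform initial features, plain
# message passing treats all four nodes identically, while the
# distances to the target set {0} separate them.
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
enc = distance_encoding(adj, target_set=[0])
# enc[0] == [0], enc[1] == [1], enc[2] == [2], enc[3] == [1]
```

The 4-cycle example shows the intuition behind the expressiveness gain: nodes that the 1-WL test cannot distinguish receive different distance-based features once a target node set is fixed.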