Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning
Learning representations of sets of nodes in a graph is crucial for applications ranging from node-role discovery to link prediction and molecule classification. Graph Neural Networks (GNNs) have achieved great success in graph representation learning. However, the expressive power of GNNs is limited by the 1-Weisfeiler-Lehman (WL) test, and thus GNNs generate identical representations for graph substructures that may in fact be very different. More powerful GNNs, proposed recently by mimicking higher-order WL tests, focus only on representing entire graphs, and they are computationally inefficient because they cannot utilize the sparsity of the underlying graph. Here we propose and mathematically analyze a general class of structure-related features, termed Distance Encoding (DE).
Review for NeurIPS paper: Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning
Summary and Contributions: [edit: see "additional feedback" section for response to author rebuttal] This paper considers the problem of learning representations of specific subgraphs of a larger graph, in such a way that subgraphs of non-isomorphic graphs should be assigned different representations. Specifically, they note that most GNNs can only distinguish things that the 1-WL test can distinguish, and propose a new method of computing graph features (called "distance encoding") to get around this restriction. The distance encoding the authors propose appears to work as follows. Suppose we want to compute a representation of a subgraph S of a graph G. Instead of running a GNN on G and then pooling across the subgraph, the subgraph S is used to modify the features of all of the nodes in G, in particular based on the distances between those nodes in G and the nodes in S (this is DEGNN). Optionally, the subgraph is also used to modify the edge structure by, conceptually, adding additional edges and/or edge features based on pairwise distances in G (this is DEAGNN).
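The mechanism described above — augmenting every node's features based on its distances to the target node set S before running the GNN — can be sketched in plain Python. This is a minimal illustration, not the authors' implementation: the function names are hypothetical, distances are truncated shortest-path lengths computed by BFS, and the features are summed one-hot encodings so the result does not depend on any ordering of S.

```python
from collections import deque

def bfs_distances(adj, source):
    """Shortest-path distances from `source` over an adjacency dict via BFS."""
    dist = {source: 0}
    q = deque([source])
    while q:
        v = q.popleft()
        for w in adj[v]:
            if w not in dist:
                dist[w] = dist[v] + 1
                q.append(w)
    return dist

def distance_encoding(adj, S, max_dist=3):
    """Sketch of DE(v | S): for each node v, sum one-hot encodings of the
    shortest-path distance from v to every target node u in S, truncated at
    max_dist (distances beyond max_dist, including unreachable, share the
    last bucket). Summing is a set aggregation, so the encoding is
    invariant to the ordering of S."""
    feats = {v: [0] * (max_dist + 2) for v in adj}
    for u in S:
        dist = bfs_distances(adj, u)
        for v in adj:
            d = min(dist.get(v, max_dist + 1), max_dist + 1)
            feats[v][d] += 1
    return feats
```

For example, on the path graph 0–1–2–3 with S = {0, 3}, node 1 sits at distance 1 from node 0 and distance 2 from node 3, so its encoding is [0, 1, 1, 0, 0]. These DE vectors would then be concatenated with (or replace) the raw node features fed to the GNN.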
This exciting paper introduces interesting and novel theoretical contributions to the graph neural network literature. The authors also verify some of their theoretical findings empirically. The paper is worth presenting at NeurIPS on the condition that the authors address the concerns raised by the reviewers regarding writing and clarity. It makes valuable contributions to better characterizing 1-WL's power, yet it jumps too quickly from 1-WL directly to distance encoding, skipping a discussion of more straightforward ways of conditioning on the target node set (such as node annotation); it remains unclear how much is gained over such simpler baselines.