Review for NeurIPS paper: Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning


This exciting paper introduces novel theoretical contributions to the graph neural network literature, and the authors empirically verify several of their theoretical findings. The paper is worth presenting at NeurIPS on the condition that the authors address the reviewers' concerns about writing and clarity. Its contribution in better characterizing the power of 1-WL is valuable; however, it jumps directly from 1-WL to distance encoding (DE) without discussing more straightforward conditioning approaches (such as node annotation) or quantifying how much is gained by those alone.
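To make the distinction concrete, here is a minimal sketch (not the authors' implementation; the function names and toy graph are illustrative) of the two conditioning schemes the review contrasts: simple target-node annotation versus a distance-based feature.

```python
from collections import deque

def bfs_distances(adj, sources):
    """Shortest-path hop distances from a set of source nodes (plain BFS)."""
    dist = {s: 0 for s in sources}
    q = deque(sources)
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def annotate(adj, targets):
    """Simpler conditioning: a one-hot flag marking the target nodes."""
    return {v: (1 if v in targets else 0) for v in adj}

def distance_encode(adj, targets):
    """Distance-style conditioning: min hop distance to the target set."""
    return bfs_distances(adj, targets)

# Toy example: a 6-cycle. Annotation only flags the target node,
# while the distance feature additionally grades every other node
# by its hop distance to the target.
adj = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
print(annotate(adj, {0}))
print(distance_encode(adj, {0}))
```

On this toy graph the annotation feature is identical for all non-target nodes, whereas the distance feature separates them into hop-distance classes, which is the extra information the review asks the authors to disentangle.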