Review for NeurIPS paper: Graph Cross Networks with Vertex Infomax Pooling

Neural Information Processing Systems 

The paper makes a novel contribution by introducing graph cross networks (GXN) and demonstrates their usefulness in practical examples. While initial concerns related to the clarity of the paper, the reviewers found that the authors did a good job of summarizing their work and addressed most of the concerns in the rebuttal. The two key components of GXN are a novel vertex infomax pooling (VIPool), which creates multiscale graphs in a trainable manner, and a novel feature-crossing layer, which enables feature interchange across scales. The authors compared their work with prior methods and surpassed all of them, which meets the bar for a NeurIPS presentation. While it does not impact the decision, the following points were left unanswered during the discussion, and it would be great if the authors could address them in their revision: (1) In VIPool, P_v, P_n, and P_{v,n} are all discrete distributions (although the feature vectors can be continuous, since there are V nodes, a sample from P_v can take at most V values, so it is a discrete distribution).
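To make point (1) concrete, here is a sketch of why discreteness matters, using notation assumed from the review (the symbols $I$, $i$, $j$, and the factored form are illustrative, not taken from the paper): because a sample from $P_v$ ranges over the $V$ vertices, $P_v$ is supported on at most $V$ atoms, and any mutual-information quantity between $P_v$ and $P_n$ reduces to a finite sum rather than an integral:

```latex
% Illustrative only: P_v, P_n supported on at most V atoms each,
% so the mutual information is a finite double sum.
I(P_v; P_n) \;=\; \sum_{i=1}^{V} \sum_{j=1}^{V}
    P_{v,n}(i,j) \,\log \frac{P_{v,n}(i,j)}{P_v(i)\,P_n(j)}
```

If this is the case, the variational lower bounds typically used for continuous mutual-information estimation may be looser than necessary, which is presumably the concern behind the reviewers' question.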