Reviews: Understanding Attention and Generalization in Graph Neural Networks

Neural Information Processing Systems 

UPDATE: I have increased the score to 6, provided the authors revise the paper as promised in their responses.

This paper covers more than one topic. The first part discusses mostly the attention mechanism, the second section introduces a new model, ChebyGIN, and the third section proposes a weakly-supervised attention training approach. Overall, the paper's scope is broader than its title, "Understanding Attention in Graph Neural Networks", suggests. In Section 2.3, the paper states that "the performance of both GCNs and GINs is quite poor and, consequently, it is also hard for the attention subnetwork to learn", and it therefore proposes ChebyGIN as a stronger model.