Reviews: Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering

Neural Information Processing Systems 

This paper builds upon previous work to propose a computationally efficient model that extends convolutions to irregular graphs. The major innovation is to express graph filtering with compactly supported kernels as linear combinations of order-K polynomials of the graph Laplacian, thus overcoming the burden of computing the Laplacian eigendecomposition, which the previous spectral construction requires in both the forward and backward passes. Moreover, the Chebyshev recurrence implies that order-K filters can be computed with O(K|E|) operations, where |E| is the number of edges and controls the cost of computing Lx (L being the Laplacian operator on the graph). This appears to be a very clean, efficient parametrization that translates into efficient learning and evaluation and overcomes an important computational limitation.

I have a couple of comments that I would like the authors to develop during the rebuttal phase:

- A previous model that operates on general graphs is the so-called Graph Neural Network (GNN) [Scarselli et al., '09].
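To make the O(K|E|) claim concrete, the Chebyshev recurrence T_k(L)x = 2L T_{k-1}(L)x - T_{k-2}(L)x lets the filter be applied with K sparse matrix-vector products and no eigendecomposition. The sketch below is my own illustration of that recurrence, not the authors' code; the function name `chebyshev_filter` is hypothetical, and a dense NumPy matrix stands in for the sparse Laplacian for readability (in practice L would be stored sparsely so each product L @ x costs O(|E|)).

```python
import numpy as np

def chebyshev_filter(L_scaled, x, theta):
    """Apply a spectral filter given by Chebyshev coefficients theta.

    L_scaled : the rescaled Laplacian 2 L / lambda_max - I, so its
               spectrum lies in [-1, 1] (dense here for illustration;
               sparse in practice, making L_scaled @ x an O(|E|) op).
    x        : graph signal of shape (n,).
    theta    : array of K+1 filter coefficients.
    """
    T_prev = x                      # T_0(L) x = x
    y = theta[0] * T_prev
    if len(theta) > 1:
        T_curr = L_scaled @ x       # T_1(L) x = L x
        y = y + theta[1] * T_curr
        for k in range(2, len(theta)):
            # Chebyshev recurrence: T_k = 2 L T_{k-1} - T_{k-2}
            T_next = 2 * (L_scaled @ T_curr) - T_prev
            y = y + theta[k] * T_next
            T_prev, T_curr = T_curr, T_next
    return y
```

Each loop iteration performs one multiplication by L_scaled, so an order-K filter costs K such products plus O(Kn) vector work, which is the O(K|E|) total the review refers to.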