Natural Graph Networks

Neural Information Processing Systems

A key requirement for graph neural networks is that they must process a graph in a way that does not depend on how the graph is described. Traditionally this has been taken to mean that a graph network must be equivariant to node permutations. Here we show that instead of equivariance, the more general concept of naturality is sufficient for a graph network to be well-defined, opening up a larger class of graph networks. We define global and local natural graph networks, the latter of which are as scalable as conventional message passing graph neural networks while being more flexible. We give one practical instantiation of a natural network on graphs which uses an equivariant message network parameterization, yielding good performance on several benchmarks.
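The abstract appeals to the traditional requirement that a graph network be equivariant to node permutations. As a minimal illustration of that property (not the paper's natural-network construction), the sketch below implements a basic sum-aggregation message passing layer in NumPy and checks numerically that relabelling the nodes commutes with the layer; the names `gnn_layer`, `W_self`, and `W_nbr` are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def gnn_layer(A, X, W_self, W_nbr):
    """One permutation-equivariant message passing layer:
    each node sums its neighbours' features (A @ X) and
    combines them with its own features."""
    return np.tanh(X @ W_self + A @ X @ W_nbr)

# Random undirected graph with 5 nodes, 3-dim node features.
n, d = 5, 3
A = rng.integers(0, 2, size=(n, n))
A = np.triu(A, 1)
A = A + A.T                      # symmetric adjacency, no self-loops
X = rng.standard_normal((n, d))
W_self = rng.standard_normal((d, d))
W_nbr = rng.standard_normal((d, d))

# Equivariance check: applying a node relabelling P before or
# after the layer gives the same result.
P = np.eye(n)[rng.permutation(n)]
out_then_perm = P @ gnn_layer(A, X, W_self, W_nbr)
perm_then_out = gnn_layer(P @ A @ P.T, P @ X, W_self, W_nbr)
assert np.allclose(out_then_perm, perm_then_out)
```

Because the aggregation is a plain sum over neighbours, every edge uses the same kernel `W_nbr`; the paper's point is that this equivariance constraint can be relaxed to the weaker condition of naturality.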


Review for NeurIPS paper: Natural Graph Networks

Neural Information Processing Systems

Additional Feedback: Specific notes:

Line 132, "local symmetries form a superset of the global symmetries": I'm not sure what this means. Local and global symmetries are different types of object, right? Is this just intended to mean that global symmetries restrict to local ones?

Line 147, "it is necessary that the feature vector ... transforms ... rather than remain invariant": Is this actually true, and if so, what is the justification? It seems to me that using a non-transforming feature vector with a powerful edge kernel could still be possible.




Natural Graph Networks

de Haan, Pim, Cohen, Taco, Welling, Max

arXiv.org Machine Learning

Conventional neural message passing algorithms are invariant under permutation of the messages and hence forget how the information flows through the network. Studying the local symmetries of graphs, we propose a more general algorithm that uses different kernels on different edges, making the network equivariant to local and global graph isomorphisms and hence more expressive. Using elementary category theory, we formalize many distinct equivariant neural networks as natural networks, and show that their kernels are 'just' a natural transformation between two functors. We give one practical instantiation of a natural network on graphs which uses an equivariant message network parameterization, yielding good performance on several benchmarks.
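The arXiv abstract's key idea is that kernels may differ per edge as long as locally isomorphic edges share a kernel. As a hedged sketch (not the paper's actual construction), the code below keys each edge's kernel by a crude local invariant, the sorted degree pair of its endpoints, so that weight sharing follows local structure while equivariance to global relabelling is preserved; `degree_pair` and `local_layer` are illustrative names.

```python
import numpy as np

rng = np.random.default_rng(1)

def degree_pair(A, i, j):
    """Crude local invariant of edge (i, j): the sorted degrees of its
    endpoints. Locally isomorphic edges receive the same key."""
    return tuple(sorted((int(A[i].sum()), int(A[j].sum()))))

def local_layer(A, X, kernels):
    """Message passing where the kernel applied on edge (i, j) depends
    on the edge's local invariant, so weights are shared only across
    edges with the same local structure."""
    out = np.zeros_like(X)
    n = X.shape[0]
    for i in range(n):
        for j in range(n):
            if A[i, j]:
                out[i] += X[j] @ kernels[degree_pair(A, i, j)]
    return out

# Random undirected graph and one kernel per distinct edge key.
n, d = 5, 3
A = rng.integers(0, 2, size=(n, n))
A = np.triu(A, 1)
A = A + A.T
X = rng.standard_normal((n, d))
keys = {degree_pair(A, i, j) for i in range(n) for j in range(n) if A[i, j]}
kernels = {k: rng.standard_normal((d, d)) for k in keys}

# Equivariance under global relabelling still holds, because the
# degree-pair key is itself invariant to relabelling.
P = np.eye(n)[rng.permutation(n)]
assert np.allclose(P @ local_layer(A, X, kernels),
                   local_layer(P @ A @ P.T, P @ X, kernels))
```

The degree pair is only a stand-in: the paper instead ties kernels to local graph isomorphisms of edge neighbourhoods, with a naturality condition relating the kernels on isomorphic neighbourhoods.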