Boosting the Cycle Counting Power of Graph Neural Networks with I$^2$-GNNs

Yinan Huang, Xingang Peng, Jianzhu Ma, Muhan Zhang

arXiv.org Artificial Intelligence 

Message Passing Neural Networks (MPNNs) are a widely used class of Graph Neural Networks (GNNs). However, knowing that one model is more powerful than another gives little insight into what functions they can or cannot express. It is still unclear whether these models can approximate specific functions such as counting certain graph substructures, which is essential for applications in biology, chemistry and social network analysis. Motivated by this, we propose to study the counting power of Subgraph MPNNs, a recent and popular class of powerful GNN models that extract a rooted subgraph for each node, assign the root node a unique identifier, and encode the root node's representation within its rooted subgraph. Specifically, we prove that Subgraph MPNNs fail to count cycles of length greater than four at the node level, implying that node representations cannot correctly encode surrounding substructures such as ring systems with more than four atoms. To overcome this limitation, we propose I$^2$-GNNs, which extend Subgraph MPNNs by assigning distinct identifiers to the root node and one of its neighbors in each rooted subgraph. To the best of our knowledge, I$^2$-GNN is the first linear-time GNN model that can count 6-cycles with theoretical guarantees. We validate its counting power on cycle counting tasks and demonstrate its competitive performance on molecular prediction benchmarks.

Relational and structured data are usually represented by graphs. Representation learning over graphs with Graph Neural Networks (GNNs) has achieved remarkable results in drug discovery, computational chemistry, combinatorial optimization and social network analysis [40, 41, 42, 43, 44, 45, 46]. Among various GNNs, the Message Passing Neural Network (MPNN) is one of the most commonly used [47, 48, 49]. However, the representational power of MPNNs is shown to be bounded by the Weisfeiler-Lehman (WL) test [4, 5], a classical algorithm for graph isomorphism testing. MPNNs cannot recognize even simple substructures such as cycles [19]. This has led to increasing attention on studying the representational power of different GNNs and on designing more powerful GNN models.
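To make the subgraph-extraction idea concrete, the following minimal Python sketch (using networkx; the function names are illustrative assumptions, not the authors' released code) shows how a Subgraph MPNN marks the root of each rooted subgraph, and how an I$^2$-GNN-style variant additionally marks one neighbor of the root, yielding one subgraph per (root, neighbor) pair.

```python
# Minimal sketch of rooted-subgraph extraction with node marking.
# Assumes networkx; `rooted_subgraphs` and `i2_rooted_subgraphs` are
# illustrative names, not part of any released API.
import networkx as nx


def rooted_subgraphs(G, num_hops=3):
    """Subgraph-MPNN style: one num_hops ego network per node,
    with the root given a unique identifier (`is_root` attribute)."""
    subgraphs = {}
    for v in G.nodes:
        sub = nx.ego_graph(G, v, radius=num_hops).copy()
        nx.set_node_attributes(sub, {u: int(u == v) for u in sub.nodes}, "is_root")
        subgraphs[v] = sub
    return subgraphs


def i2_rooted_subgraphs(G, num_hops=3):
    """I^2-GNN style: one subgraph per (root, root-neighbor) pair,
    with a second identifier marking the chosen neighbor as well."""
    subgraphs = {}
    for v in G.nodes:
        for w in G.neighbors(v):
            sub = nx.ego_graph(G, v, radius=num_hops).copy()
            nx.set_node_attributes(sub, {u: int(u == v) for u in sub.nodes}, "is_root")
            nx.set_node_attributes(sub, {u: int(u == w) for u in sub.nodes}, "is_marked_neighbor")
            subgraphs[(v, w)] = sub
    return subgraphs


if __name__ == "__main__":
    G = nx.cycle_graph(6)  # a 6-cycle, the case Subgraph MPNNs provably cannot count
    print(len(rooted_subgraphs(G)))     # 6 rooted subgraphs, one per node
    print(len(i2_rooted_subgraphs(G)))  # 12 subgraphs, one per (node, neighbor) pair
```

The sketch stops at extraction and marking; in the full models, an ordinary MPNN is run inside each marked subgraph and the resulting subgraph representations are pooled back to node- or graph-level embeddings.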
