Neural Information Processing Systems 

Equivariant Graph Neural Networks (GNNs) that incorporate the E(3) symmetry have achieved significant success in various scientific applications. As one of the most successful models, EGNN [1] leverages a simple scalarization technique to perform equivariant message passing over only Cartesian vectors (i.e., first-degree steerable vectors), enjoying greater efficiency and efficacy than equivariant GNNs that use higher-degree steerable vectors. This success suggests that higher-degree representations might be unnecessary. In this paper, we disprove this hypothesis by analyzing the expressivity of equivariant GNNs on symmetric structures, including k-fold rotations and regular polyhedra. We theoretically demonstrate that on such structures, equivariant GNNs always degenerate to a zero function whenever the degree of the output representations is fixed to 1 or to other specific values. Building on this theoretical insight, we propose HEGNN, a high-degree version of EGNN that increases expressivity by incorporating high-degree steerable vectors while retaining EGNN's efficiency through the scalarization trick. Extensive experiments show that HEGNN not only matches our theoretical analysis on a toy dataset of symmetric structures, but also achieves substantial improvements on more complicated datasets without obvious symmetry, including N-body and MD17.
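The two ideas the abstract leans on, EGNN's scalarization trick and the degeneracy of degree-1 outputs on symmetric structures, can both be seen in a few lines. Below is a minimal NumPy sketch (not the authors' code; `egnn_layer` and its tanh edge weighting are illustrative stand-ins for learned MLPs): one EGNN-style layer whose messages depend only on invariant scalars, run on a structure with 5-fold rotational symmetry. The coordinate update at the symmetry center comes out as the zero vector, which is the degeneracy the paper proves.

```python
import numpy as np

def egnn_layer(h, x):
    """One EGNN-style message-passing step over a fully connected graph.

    h : (n, d) invariant node features
    x : (n, 3) Cartesian coordinates (degree-1 steerable vectors)
    """
    diff = x[:, None, :] - x[None, :, :]           # (n, n, 3) relative vectors
    dist2 = (diff ** 2).sum(-1, keepdims=True)     # (n, n, 1) invariant scalars
    # stand-in for the edge MLP phi_e: any function of invariants only
    w = np.tanh(dist2 + h.sum(-1)[:, None, None] + h.sum(-1)[None, :, None])
    np.fill_diagonal(w[..., 0], 0.0)               # no self-messages
    dx = (w * diff).mean(axis=1)                   # equivariant coordinate update
    return x + dx

k = 5                                              # 5-fold rotation (regular pentagon)
angles = 2 * np.pi * np.arange(k) / k
ring = np.stack([np.cos(angles), np.sin(angles), np.zeros(k)], axis=1)
x = np.vstack([np.zeros((1, 3)), ring])            # symmetry center + symmetric ring
h = np.ones((k + 1, 2))                            # identical invariant features
x_new = egnn_layer(h, x)
print(np.linalg.norm(x_new[0] - x[0]))             # ~0: update at the center vanishes
```

Because the update is a sum of relative vectors with invariant weights, any k-fold rotation fixing the center permutes the terms without changing their sum, forcing the output vector at the center to zero. HEGNN sidesteps this collapse by additionally carrying higher-degree steerable vectors, while keeping the cheap scalar-weighted message passing above.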
