Technical Report: The Graph Spectral Token -- Enhancing Graph Transformers with Spectral Information

Zihan Pengmei, Zimu Li

arXiv.org Artificial Intelligence 

Graph transformers have demonstrated impressive results compared to conventional Message-Passing Graph Neural Networks (MP-GNNs) on various graph benchmarks. They aim to address inherent limitations of MP-GNNs, such as over-squashing, where recursive neighborhood aggregation compresses information from exponentially growing receptive fields and loses local detail, and under-reaching, where the receptive field of a node is limited by the number of layers [1, 2, 3, 4, 5]. The self-attention mechanism in graph transformers acts as a fully-connected graph neural network, allowing for more efficient information exchange. GraphTrans [2] and SubFormer [5] are two similar graph transformer architectures that combine shallow MP-GNN layers for local feature extraction with standard Transformer blocks for global information exchange. However, SubFormer incorporates the molecular coarse-graining assumption [6, 7], which simplifies the graph structure by grouping nodes into substructures, while GraphTrans does not.
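The hybrid design described above (shallow MP-GNN layers for local features, then Transformer-style self-attention for global exchange) can be sketched in a few lines. This is a minimal NumPy illustration, not the actual GraphTrans or SubFormer implementation: the layer shapes, activation choices, and function names (`mp_layer`, `self_attention`) are hypothetical, and real models add residual connections, normalization, and multiple heads.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def mp_layer(H, A, W):
    # One message-passing step: mean-aggregate over graph neighbors
    # (local, structure-dependent), then linear transform + ReLU.
    deg = A.sum(axis=1, keepdims=True)
    agg = (A @ H) / np.maximum(deg, 1.0)
    return np.maximum(0.0, agg @ W)

def self_attention(H, Wq, Wk, Wv):
    # Single-head self-attention: every node attends to every node,
    # i.e. message passing on a fully-connected graph.
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    scores = (Q @ K.T) / np.sqrt(K.shape[1])
    return softmax(scores, axis=-1) @ V

rng = np.random.default_rng(0)
n, d = 5, 8
H = rng.standard_normal((n, d))                    # node features
A = (rng.random((n, n)) < 0.4).astype(float)       # sparse adjacency
np.fill_diagonal(A, 0.0)

# Stage 1: shallow MP-GNN for local feature extraction
H = mp_layer(H, A, rng.standard_normal((d, d)))
# Stage 2: Transformer block for global exchange, independent of A
H = self_attention(H, *(rng.standard_normal((d, d)) for _ in range(3)))
print(H.shape)  # (5, 8)
```

Note that the attention stage never consults `A`: this is what lets information travel between distant nodes in one step, sidestepping the under-reaching problem that a stack of local `mp_layer` calls would have.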
