Technical Report: The Graph Spectral Token -- Enhancing Graph Transformers with Spectral Information
arXiv.org Artificial Intelligence
Graph transformers have demonstrated impressive results compared to conventional Message-Passing Graph Neural Networks (MP-GNNs) on various graph benchmarks. They aim to address inherent limitations of MP-GNNs, such as over-squashing, where recursive neighborhood aggregation compresses information from an exponentially growing receptive field into fixed-size node vectors and loses detail, and under-reaching, where a node's receptive field is bounded by the number of layers [1, 2, 3, 4, 5]. The self-attention mechanism in graph transformers acts as a fully connected graph neural network, enabling direct information exchange between all pairs of nodes. GraphTrans [2] and SubFormer [5] are two similar graph transformer architectures that combine shallow MP-GNN layers for local feature extraction with standard Transformer blocks for global information exchange. However, SubFormer additionally adopts a molecular coarse-graining assumption [6, 7], which simplifies the graph by grouping nodes into substructures, whereas GraphTrans does not.
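The hybrid design described above (shallow local message passing followed by global self-attention) can be sketched with plain NumPy. This is a minimal illustration, not the actual GraphTrans or SubFormer implementation: the layer shapes, mean aggregation, and single-head attention are simplifying assumptions made here for clarity.

```python
import numpy as np

def mp_gnn_layer(X, A, W):
    # Local step: mean-aggregate each node's neighbor features over the
    # adjacency matrix A, then apply a linear map and ReLU.
    deg = A.sum(axis=1, keepdims=True).clip(min=1.0)
    H = (A @ X) / deg
    return np.maximum(H @ W, 0.0)

def self_attention(X, Wq, Wk, Wv):
    # Global step: full self-attention, i.e. every node attends to every
    # other node regardless of graph distance (a fully connected GNN).
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)      # rows sum to 1
    return attn @ V

rng = np.random.default_rng(0)
n, d = 5, 8                                      # toy graph: 5 nodes, 8-dim features
X = rng.normal(size=(n, d))
A = (rng.random((n, n)) < 0.4).astype(float)     # random adjacency
np.fill_diagonal(A, 0.0)

# GraphTrans/SubFormer-style pipeline: shallow MP-GNN, then a Transformer-style
# attention block over the resulting node embeddings.
H = mp_gnn_layer(X, A, rng.normal(size=(d, d)))
Z = self_attention(H, *(rng.normal(size=(d, d)) for _ in range(3)))
print(Z.shape)
```

The key point the sketch makes concrete: the MP-GNN step only mixes features along edges of `A`, while the attention step mixes all node pairs, which is how the Transformer block sidesteps under-reaching.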
Apr-8-2024