Supra-Laplacian Encoding for Transformer on Dynamic Graphs
Marc Lafon, Conservatoire National des Arts et Métiers, CEDRIC, EA4629
Neural Information Processing Systems
Fully connected Graph Transformers (GTs) have rapidly become prominent in the static graph community as an alternative to Message-Passing models, which suffer from a lack of expressivity, over-squashing, and under-reaching. However, in a dynamic context, interconnecting all nodes across multiple snapshots with self-attention causes GTs to lose both structural and temporal information. In this work, we introduce Supra-LAplacian encoding for spatio-temporal TransformErs (SLATE), a new spatio-temporal encoding that leverages the GT architecture while preserving spatio-temporal information. Specifically, we transform Discrete Time Dynamic Graphs into multi-layer graphs and take advantage of the spectral properties of their associated supra-Laplacian matrix.
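The multi-layer construction described above can be sketched as follows: stack the snapshot adjacency matrices on the block diagonal of a supra-adjacency matrix, add inter-layer edges linking each node to its own copy in the adjacent snapshot, form the supra-Laplacian, and take its low-frequency eigenvectors as a spatio-temporal positional encoding. This is a minimal illustration of the general supra-Laplacian idea, not SLATE's exact implementation; the function name, the uniform `coupling` weight, and the choice of the combinatorial (unnormalized) Laplacian are assumptions for the sketch.

```python
import numpy as np

def supra_laplacian_encoding(snapshots, k=4, coupling=1.0):
    """Sketch: build the supra-Laplacian of a multi-layer graph from
    T snapshot adjacency matrices (each N x N) and return the
    eigenvectors of its k smallest eigenvalues as an encoding.
    `coupling` is an assumed weight for the inter-layer edges that
    connect a node to itself in the neighboring snapshot."""
    T = len(snapshots)
    N = snapshots[0].shape[0]
    A = np.zeros((N * T, N * T))
    # Intra-layer blocks: one snapshot per diagonal block.
    for t, At in enumerate(snapshots):
        A[t * N:(t + 1) * N, t * N:(t + 1) * N] = At
    # Inter-layer edges: node i at time t <-> node i at time t+1.
    idx = np.arange(N)
    for t in range(T - 1):
        A[t * N + idx, (t + 1) * N + idx] = coupling
        A[(t + 1) * N + idx, t * N + idx] = coupling
    # Combinatorial supra-Laplacian L = D - A (symmetric).
    L = np.diag(A.sum(axis=1)) - A
    eigvals, eigvecs = np.linalg.eigh(L)
    # One row per (node, time) pair: shape (N*T, k).
    return eigvecs[:, :k]

# Example: two 3-node snapshots whose edge sets change over time.
A0 = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
A1 = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)
enc = supra_laplacian_encoding([A0, A1], k=2)
print(enc.shape)  # (6, 2)
```

Because the inter-layer couplings make the supra-graph connected, the eigenvector of the zero eigenvalue is constant, and the next eigenvectors capture the coarse spatio-temporal structure that a fully connected Transformer would otherwise discard.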
May-28-2025, 17:12:10 GMT