Hierarchical Graph Transformer with Adaptive Node Sampling
Neural Information Processing Systems
The Transformer architecture has achieved remarkable success in domains including natural language processing and computer vision. On graph-structured data, however, Transformers have yet to achieve competitive performance, especially on large graphs. In this paper, we identify the main deficiencies of current graph Transformers: (1) existing node sampling strategies in graph Transformers are agnostic to the graph characteristics and the training process.