Subgraph Gaussian Embedding Contrast for Self-Supervised Graph Representation Learning
Xie, Shifeng, Einizade, Aref, Giraldo, Jhony H.
arXiv.org Artificial Intelligence
Graph Representation Learning (GRL) is a fundamental task in machine learning that aims to encode high-dimensional graph-structured data into low-dimensional vectors. Self-Supervised Learning (SSL) methods are widely used in GRL because they avoid expensive human annotation. In this work, we propose a novel Subgraph Gaussian Embedding Contrast (SubGEC) method. Our approach introduces a subgraph Gaussian embedding module, which adaptively maps subgraphs to a structured Gaussian space, preserving the characteristics of the input subgraphs while generating embeddings with a controlled distribution. We then employ optimal transport distances, specifically the Wasserstein and Gromov-Wasserstein distances, to measure the similarity between subgraphs, enhancing the robustness of the contrastive learning process. Extensive experiments across multiple benchmarks demonstrate that SubGEC outperforms or is competitive with state-of-the-art approaches. Our findings provide insights into the design of SSL methods for GRL, emphasizing the importance of the distribution of the generated contrastive pairs.
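Since the method maps subgraphs into a Gaussian space and compares them with the Wasserstein distance, the key quantity is the 2-Wasserstein distance between two Gaussians, which has a well-known closed form. The sketch below is illustrative only and is not the authors' implementation; the function names (`psd_sqrtm`, `gaussian_w2`) and the pure-NumPy setup are our own assumptions.

```python
import numpy as np

def psd_sqrtm(A):
    """Matrix square root of a symmetric PSD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    w = np.clip(w, 0.0, None)  # guard against tiny negative eigenvalues
    return (V * np.sqrt(w)) @ V.T

def gaussian_w2(m1, C1, m2, C2):
    """Closed-form 2-Wasserstein distance between N(m1, C1) and N(m2, C2):
    W2^2 = ||m1 - m2||^2 + Tr(C1 + C2 - 2 (C2^{1/2} C1 C2^{1/2})^{1/2})."""
    mean_term = float(np.sum((m1 - m2) ** 2))
    C2_half = psd_sqrtm(C2)
    cross = psd_sqrtm(C2_half @ C1 @ C2_half)
    cov_term = float(np.trace(C1 + C2 - 2.0 * cross))
    return float(np.sqrt(max(mean_term + cov_term, 0.0)))

# Example: two isotropic Gaussians whose means differ by 3 along one axis.
m, C = np.zeros(2), np.eye(2)
d = gaussian_w2(m, C, np.array([3.0, 0.0]), C)  # distance equals 3.0
```

Because both covariances are identity here, the covariance term vanishes and the distance reduces to the Euclidean gap between the means; with non-trivial covariances the trace term penalizes shape mismatch between the two distributions.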
Jun-13-2025