Graph contrastive learning.
Obtaining high-quality labeled datasets at scale for graph-related problems is often expensive. Graph neural networks tend to overfit small training sets and fail to learn reusable, task-invariant representations. Self-supervised learning has been highly successful across several areas of machine learning and improves label efficiency; these techniques derive supervisory signals from unlabeled data by exploiting its underlying structure. The goal of graph contrastive learning is to learn a low-dimensional representation that encodes a graph's structural and attribute information, typically by maximizing agreement between embeddings of two augmented views of the same graph while pushing apart embeddings of different graphs.
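A minimal NumPy sketch of the idea, under assumed simplifications: each graph is augmented twice by random edge dropping, encoded with a single randomly initialized GCN-style propagation layer plus mean pooling, and the two views are pulled together with an NT-Xent (InfoNCE) loss. All function names, the one-layer encoder, and the edge-dropping augmentation are illustrative choices, not a specific published method.

```python
import numpy as np

rng = np.random.default_rng(0)

def normalize_adj(A):
    # Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return d_inv_sqrt @ A_hat @ d_inv_sqrt

def drop_edges(A, p, rng):
    # Augmentation: drop each undirected edge independently with probability p
    keep = np.triu(rng.random(A.shape) > p, k=1)
    kept = np.triu(A, k=1) * keep
    return kept + kept.T

def encode(A, X, W):
    # One GCN-style propagation layer, then mean pooling -> unit-norm graph embedding
    H = np.tanh(normalize_adj(A) @ X @ W)
    z = H.mean(axis=0)
    return z / np.linalg.norm(z)

def nt_xent(z1, z2, tau=0.5):
    # InfoNCE-style loss over a batch of paired graph embeddings (rows of z1, z2)
    z = np.vstack([z1, z2])
    sim = z @ z.T / tau
    np.fill_diagonal(sim, -np.inf)          # exclude self-similarity
    n = z1.shape[0]
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])  # index of each row's positive
    logits = sim - sim.max(axis=1, keepdims=True)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()

# Toy batch of 4 random graphs with node features (purely illustrative data)
n_nodes, n_feat, d = 6, 8, 4
W = rng.normal(size=(n_feat, d))            # shared, untrained encoder weights
z1, z2 = [], []
for _ in range(4):
    A = np.triu(rng.random((n_nodes, n_nodes)) < 0.4, k=1).astype(float)
    A = A + A.T
    X = rng.normal(size=(n_nodes, n_feat))
    z1.append(encode(drop_edges(A, 0.2, rng), X, W))  # view 1
    z2.append(encode(drop_edges(A, 0.2, rng), X, W))  # view 2
loss = nt_xent(np.array(z1), np.array(z2))
```

In a real pipeline the encoder weights `W` would be trained by gradient descent on this loss (e.g. in PyTorch); the sketch only shows how augmented views and the contrastive objective fit together.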
Jan-13-2023, 13:55:07 GMT