

A Flexible Generative Framework for Graph-based Semi-supervised Learning

Jiaqi Ma, Weijing Tang, Ji Zhu, Qiaozhu Mei

Neural Information Processing Systems

We consider a family of problems concerned with making predictions for the majority of unlabeled, graph-structured data samples based on a small proportion of labeled samples. Relational information among the data samples, often encoded in the graph/network structure, is shown to be helpful for these semi-supervised learning tasks.


Graph Stochastic Neural Networks for Semi-supervised Learning: Supplemental Material

Neural Information Processing Systems

Let θ and φ denote the optimal parameters after model training. The detailed statistics of the three datasets used in this paper are listed in Table 1. In this paper, when evaluating performance in the standard experimental scenario and in the label-scarce scenario, we compare with six state-of-the-art baselines used for graph-based semi-supervised learning. Three of them are deterministic GNN-based models: GCN [1], Graph Attention Networks (GAT) [2], and GraphSAGE [3].
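To make the deterministic GNN baselines concrete, the following is a minimal sketch of a single GCN propagation step in the style of Kipf and Welling [1], H' = ReLU(D^-1/2 (A + I) D^-1/2 H W). The toy graph, features, and weights are illustrative assumptions, not data from the paper.

```python
import numpy as np

def gcn_layer(adj, features, weights):
    """One graph-convolution layer with symmetric normalization."""
    a_hat = adj + np.eye(adj.shape[0])           # add self-loops: A + I
    deg = a_hat.sum(axis=1)                      # node degrees of A + I
    d_inv_sqrt = np.diag(deg ** -0.5)            # D^{-1/2}
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt     # normalized adjacency
    return np.maximum(a_norm @ features @ weights, 0.0)  # ReLU activation

# Hypothetical 3-node path graph: 0 - 1 - 2
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
feats = np.eye(3)                 # one-hot node features
w = np.ones((3, 2))               # toy weight matrix
out = gcn_layer(adj, feats, w)    # hidden representations, shape (3, 2)
```

A full GCN stacks two or three such layers and trains the weights with a cross-entropy loss on the labeled nodes only, which is what makes the classification function deterministic once training converges.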


Graph Stochastic Neural Networks for Semi-supervised Learning

Neural Information Processing Systems

Graph Neural Networks (GNNs) have achieved remarkable performance in the task of semi-supervised node classification. However, most existing models learn a deterministic classification function, which lacks sufficient flexibility to explore better choices in the presence of various kinds of imperfect observed data, such as scarce labeled nodes and noisy graph structure.