Scalable Deep Generative Relational Model with High-Order Node Dependence
In this work, we propose a probabilistic framework for relational data modelling and latent structure exploration. Given possible feature information for the nodes in a network, our model builds a deep architecture that can approximate the potentially nonlinear mappings between the nodes' feature information and their latent representations. For each node, we incorporate high-order structure information from all of its neighbourhoods when generating its latent representation, such that these latent representations are "smooth" over the network. Since the latent representations are generated from Dirichlet distributions, we further develop a data augmentation trick to enable efficient Gibbs sampling for the Bernoulli-Poisson (Ber-Poisson) likelihood with Dirichlet random variables. Our model is readily applicable to large sparse networks, as its computational cost scales with the number of positive links in the network. The superior performance of our model is demonstrated through improved link prediction on a range of real-world datasets.
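The abstract's two key ingredients can be sketched in a few lines: Dirichlet-distributed node memberships whose concentrations are smoothed over neighbours, plugged into a Bernoulli-Poisson link likelihood. This is an illustrative toy construction, not the paper's exact model; all variable names and the 50/50 smoothing weight are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: N nodes, K latent communities, adjacency A.
N, K = 6, 3
A = np.zeros((N, N), dtype=int)
for i, j in [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)]:
    A[i, j] = A[j, i] = 1

# Each node's membership pi_i ~ Dirichlet(alpha_i), where alpha_i mixes
# the node's own prior with its neighbours' priors, so the resulting
# embeddings are "smooth" over the network (illustrative weighting).
base_alpha = rng.gamma(1.0, 1.0, size=(N, K)) + 0.1
smooth_alpha = base_alpha.copy()
for i in range(N):
    nbrs = np.flatnonzero(A[i])
    if len(nbrs) > 0:
        smooth_alpha[i] = 0.5 * base_alpha[i] + 0.5 * base_alpha[nbrs].mean(axis=0)
pi = np.array([rng.dirichlet(a) for a in smooth_alpha])

# Bernoulli-Poisson link: P(A_ij = 1) = 1 - exp(-sum_k lam_k * pi_ik * pi_jk),
# which assigns vanishing probability to pairs with no shared communities.
lam = rng.gamma(1.0, 1.0, size=K)
rate = np.einsum('ik,k,jk->ij', pi, lam, pi)
link_prob = 1.0 - np.exp(-rate)
print(link_prob.shape)  # (6, 6)
```

Because the Dirichlet rows sum to one, each `pi[i]` is a proper membership distribution, and the link probability is always in [0, 1].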
Reviews: Scalable Deep Generative Relational Model with High-Order Node Dependence
First of all, I don't think representing each node with a Dirichlet distribution, which forces a positive node embedding, is a good choice. It is quite different from traditional real-valued embedding methods, and I suspect positive embedding representations will lose semantic information compared to real-valued ones. If there are other positive-embedding methods, please cite them to clarify their relation to the proposed method. As mentioned in the article, the proposed SDREM, which propagates information through neighbours, works in a similar spirit to the spatial graph convolutional network (GCN) in a frequentist setting. However, as far as I am concerned, GCNs as typically applied not only consider neighbouring information in graphs, but also propagate each node embedding to a deeper representation through a fully connected network.
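For contrast with the Dirichlet propagation the reviewer criticises, the spatial GCN layer they describe combines neighbour aggregation with a fully connected transform. A minimal sketch in the standard symmetric-normalised form (the function name and random weights are illustrative, not from the paper):

```python
import numpy as np

def gcn_layer(A, H, W):
    """One spatial GCN layer: add self-loops, symmetrically normalise the
    adjacency, average neighbour features, then apply a fully connected
    projection W followed by ReLU."""
    A_hat = A + np.eye(A.shape[0])           # self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ H @ W, 0.0)   # aggregate, project, ReLU

rng = np.random.default_rng(1)
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
H = rng.normal(size=(3, 4))   # real-valued node features
W = rng.normal(size=(4, 2))   # weights (random here; learned in practice)
out = gcn_layer(A, H, W)
print(out.shape)  # (3, 2)
```

Note the inputs and outputs are unconstrained real values before the ReLU, which is the reviewer's point of contrast with simplex-valued Dirichlet embeddings.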
Meta-Review: Scalable Deep Generative Relational Model with High-Order Node Dependence
The paper was reviewed by three experts in the field. The reviewers and the AC all agree that the paper contains novel contributions, but share the opinion that it could be strengthened by addressing the reviewers' comments. In addition to the reviewers' comments, such as the need to add comparisons with VGAE and its variants, the AC would like to provide some additional feedback to the authors: The AC views the paper as a smart combination of the edge partition model, the gamma belief net, and the Dirichlet belief net, enhanced by adding covariate dependence and by incorporating the network information in learning the connection weights of the Dirichlet belief net. Pros: 1) the combination is non-trivial: replacing the gamma weights in the edge partition model with latent counts is the key to allowing closed-form Gibbs sampling (upward latent count propagation followed by downward variable sampling). How the X is used in (3) and sampled in (5) is novel.
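The closed-form Gibbs step the AC highlights rests on a standard data-augmentation idea for the Bernoulli-Poisson likelihood: non-edges receive a latent count of zero, while each observed edge receives a zero-truncated Poisson count, so sampling work scales with the number of positive links. A minimal sketch under these assumptions (function name and toy rates are hypothetical, not the paper's implementation):

```python
import numpy as np

def sample_latent_counts(A, rate, rng):
    """Data augmentation for a Bernoulli-Poisson link: each non-edge
    gets latent count 0; each edge (i, j) gets a zero-truncated
    Poisson(rate[i, j]) count. Only positive links need sampling."""
    M = np.zeros_like(A, dtype=int)
    for i, j in zip(*np.nonzero(np.triu(A, 1))):
        m = 0
        while m == 0:                    # simple rejection sampling of
            m = rng.poisson(rate[i, j])  # a zero-truncated Poisson
        M[i, j] = M[j, i] = m
    return M

rng = np.random.default_rng(2)
A = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]])
rate = np.full((3, 3), 1.5)              # toy per-pair Poisson rates
M = sample_latent_counts(A, rate, rng)
print(M[1, 2])  # 0 -- non-edges contribute no latent counts
```

With the counts in hand, conjugate updates for the upstream variables become available, which is what enables the upward count propagation and downward sampling the AC describes.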
Fan, Xuhui; Li, Bin; Li, Caoyuan; Sisson, Scott; Chen, Ling