- Asia > China > Shanghai > Shanghai (0.04)
- Oceania > Australia > New South Wales (0.04)
- North America > Canada (0.04)
- Asia > Middle East > Jordan (0.04)
- Information Technology > Communications (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Learning Graphical Models > Directed Networks > Bayesian Learning (0.70)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Uncertainty > Bayesian Inference (0.47)
Reviews: Scalable Deep Generative Relational Models with High-Order Node Dependence
First of all, I don't think representing each node by a Dirichlet distribution, which forces a positive node embedding, is a good choice. It is quite different from traditional real-valued embedding methods, and I assume positive embedding representations will carry less semantic information than real-valued ones. So if there are any other positive embedding methods, please cite them to illustrate their relation to the proposed method. As mentioned in the article, the proposed SDREM, which propagates information through neighbours, works in a similar spirit to the spatial graph convolutional network (GCN) in a frequentist setting. But as far as I am concerned, GCNs as already applied not only consider neighbouring information in the graph, but also propagate each node embedding to a deeper representation through a fully connected network.
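The reviewer's contrast can be made concrete with a minimal sketch of one spatial GCN layer (a generic textbook formulation, not code from the paper; the symmetric normalisation and ReLU follow the common Kipf & Welling recipe, and all shapes and weights here are illustrative assumptions): the layer both mixes neighbouring embeddings and pushes the result through a fully connected transform.

```python
import numpy as np

def gcn_layer(adj, H, W):
    """One spatial GCN layer in the sense the reviewer describes:
    aggregate neighbouring embeddings (symmetric normalisation with
    self-loops) and then pass the result through a fully connected
    weight matrix W with a ReLU nonlinearity."""
    A_hat = adj + np.eye(adj.shape[0])               # add self-loops
    d_inv_sqrt = np.diag(A_hat.sum(axis=1) ** -0.5)
    A_norm = d_inv_sqrt @ A_hat @ d_inv_sqrt         # normalised adjacency
    return np.maximum(A_norm @ H @ W, 0.0)           # neighbour mix + FC layer

# toy graph: 3 nodes in a path, real-valued 4-dim embeddings -> 8-dim
rng = np.random.default_rng(1)
adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], float)
H0 = rng.random((3, 4))                              # initial node embeddings
H1 = gcn_layer(adj, H0, rng.random((4, 8)))          # deeper representation
```

Note that the embeddings stay real-valued (only clipped by the ReLU), with no simplex constraint, which is the distinction the reviewer draws against SDREM's Dirichlet representations.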
Scalable Deep Generative Relational Models with High-Order Node Dependence
Xuhui Fan, Bin Li, Scott Anthony Sisson, Caoyuan Li, Ling Chen
We propose a probabilistic framework for modelling and exploring the latent structure of relational data. Given feature information for the nodes in a network, the scalable deep generative relational model (SDREM) builds a deep network architecture that can approximate potential nonlinear mappings between nodes' feature information and the nodes' latent representations. Our contribution is two-fold: (1) We incorporate high-order neighbourhood structure information to generate the latent representations at each node, which vary smoothly over the network. (2) Due to the Dirichlet random variable structure of the latent representations, we introduce a novel data augmentation trick which permits efficient Gibbs sampling. The SDREM can be used for large sparse networks as its computational cost scales with the number of positive links. We demonstrate its competitive performance through improved link prediction performance on a range of real-world datasets.
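The abstract's propagation scheme can be illustrated with a minimal sketch (not the authors' implementation; the layer rule, mixing scheme, and normalisation are assumptions chosen only to mirror the description): each node's latent representation lives on the simplex, mimicking the Dirichlet structure, and is formed by combining a nonlinear mapping of the node's own features with its neighbours' previous-layer representations, so representations vary smoothly over the network.

```python
import numpy as np

rng = np.random.default_rng(0)

def propagate_layer(prev_reps, features, adj, W):
    """One illustrative SDREM-style layer: mix each node's (nonlinearly
    transformed) features with its neighbours' previous-layer
    representations, then normalise onto the simplex so every row is a
    positive, sum-to-one latent representation (Dirichlet-like)."""
    own = np.maximum(features @ W, 1e-6)             # nonlinear feature mapping
    neigh = adj @ prev_reps                          # aggregate neighbour info
    mixed = own + neigh
    return mixed / mixed.sum(axis=1, keepdims=True)  # project onto the simplex

# toy network: 4 nodes, 3 features each, K = 2 latent communities
features = rng.random((4, 3))
adj = np.array([[0, 1, 1, 0], [1, 0, 0, 1],
                [1, 0, 0, 1], [0, 1, 1, 0]], float)
W = rng.random((3, 2))

reps = np.full((4, 2), 0.5)          # uniform initial representations
for _ in range(2):                   # two propagation layers
    reps = propagate_layer(reps, features, adj, W)
```

Since neighbour aggregation uses only the rows of `adj` with positive links, the per-layer cost of this sketch scales with the number of edges, echoing the abstract's scalability claim.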
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks (1.00)
- Information Technology > Communications > Networks (0.90)
- Information Technology > Artificial Intelligence > Machine Learning > Learning Graphical Models > Directed Networks > Bayesian Learning (0.89)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Uncertainty > Bayesian Inference (0.67)