Deep Dynamic Poisson Factorization Model
A new model, named the deep dynamic Poisson factorization model, is proposed in this paper for analyzing sequential count vectors. Based on Poisson factor analysis, the model captures dependence among time steps with neural networks that represent implicit distributions. Complicated local relationships are obtained from a local implicit distribution, and a deep latent structure is exploited to capture long-time dependence. Inference is performed by variational inference on the latent variables together with gradient descent on loss functions derived from the variational distribution. The proposed model is applied to synthetic and real-world datasets, and the results show good prediction and fitting performance with an interpretable latent structure.
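The abstract describes sequential count vectors modeled by Poisson factor analysis: each observed count vector is drawn from a Poisson likelihood whose rate is the product of a nonnegative factor-loading matrix and time-varying latent factors, with a neural network carrying dependence across time steps. A minimal generative sketch of that setup is below; the dimensions, priors, the squashed recurrent transition, and all variable names are illustrative assumptions, not the paper's exact specification:

```python
import math
import random

random.seed(0)

def poisson(lam):
    # Knuth's algorithm: multiply uniforms until the product falls below e^-lam.
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

V, K, T = 8, 3, 20  # observed dimension, latent factors, time steps

# Nonnegative factor loadings Phi (V x K), drawn from a Gamma prior.
Phi = [[random.gammavariate(1.0, 1.0) for _ in range(K)] for _ in range(V)]

# Latent factors theta_t evolve over time; a tanh-then-softplus linear map
# stands in here for the paper's neural-network transition (an assumption
# made only to keep the rates bounded and positive in this sketch).
W = [[random.gauss(0.0, 0.5) for _ in range(K)] for _ in range(K)]
theta = [[random.gammavariate(1.0, 1.0) for _ in range(K)]]
for t in range(1, T):
    prev = theta[-1]
    theta.append([
        math.log1p(math.exp(math.tanh(sum(W[i][j] * prev[j] for j in range(K)))))
        for i in range(K)
    ])

# Each observed count vector x_t ~ Poisson(Phi theta_t), elementwise.
X = [[poisson(sum(Phi[v][k] * theta[t][k] for k in range(K))) for v in range(V)]
     for t in range(T)]

print(len(X), len(X[0]))  # 20 8
```

Inference in the paper goes in the opposite direction: given the counts `X`, a variational distribution over the latent factors is fit jointly with the transition network's parameters.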
Reviews: Deep Dynamic Poisson Factorization Model
This paper introduces the deep dynamic Poisson factorization model, which builds on Poisson factorization (PF) to allow for temporal dependencies. In contrast to previous work on dynamic PF, this paper uses a simplified version of a recurrent neural network to allow for long-term dependencies. Inference is carried out via variational inference, with an extra step to maximize over the neural network parameters. The paper reports experiments on 5 real-world datasets. Overall, I found the idea potentially interesting, but I still think the paper needs some improvements in its execution before it can be accepted at a conference like NIPS.
Chengyue Gong, Win-bin Huang