Review for NeurIPS paper: A Discrete Variational Recurrent Topic Model without the Reparametrization Trick
Neural Information Processing Systems
Summary and Contributions: In this paper, the authors utilize neural variational inference to construct a neural topic model with discrete random variables, and propose one model, namely VRTM, which combines an RNN-based language model with an LDA-style topic model. To be specific, the authors first design a reasonable generative model that applies different strategies for generating thematic and syntactic words, i.e., a mixture of LDA and RNN predictions or just the output of the RNN. When facing a thematic word, VRTM uses both the RNN and topic-model predictions to generate the next word; when facing a syntactic word, only the output of the RNN is used. In addition, during the generative process, a discrete topic assignment is attached to each thematic word, which benefits interpretability.

Strengths: 1. The exploration of combining RNNs and topic models is interesting and significant: it can help topic models handle sequential text and capture more textual information than the bag-of-words representation prevalently used in LDA-based topic models.
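To make the reviewed generative step concrete, the word-level mixture described above can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the equal-weight mixture, the toy dimensions, and all parameter names (`beta`, `W_out`, `theta`, `h_t`) are assumptions for exposition; the paper marginalizes the discrete topic assignment z drawn from the document-topic proportions.

```python
import numpy as np

rng = np.random.default_rng(0)
V, K, H = 10, 3, 8  # toy vocab size, number of topics, RNN hidden size

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical parameters: topic-word distributions and RNN output projection.
beta = np.apply_along_axis(softmax, 1, rng.normal(size=(K, V)))  # p(word | topic k)
W_out = rng.normal(size=(H, V))

def next_word_dist(h_t, theta, is_thematic):
    """Distribution over the next word.

    Thematic word: mix the topic-model prediction (marginalizing the discrete
    topic assignment z ~ theta) with the RNN prediction.
    Syntactic word: use the RNN prediction alone.
    """
    p_rnn = softmax(h_t @ W_out)
    if not is_thematic:
        return p_rnn
    p_topic = theta @ beta          # sum_k theta_k * p(word | topic k)
    return 0.5 * (p_topic + p_rnn)  # equal-weight mixture (an assumption here)

theta = softmax(rng.normal(size=K))  # document-topic proportions
h_t = rng.normal(size=H)             # RNN hidden state at step t

p_thematic = next_word_dist(h_t, theta, is_thematic=True)
p_syntactic = next_word_dist(h_t, theta, is_thematic=False)
```

Both branches yield valid distributions over the vocabulary; only the thematic branch carries topic information, which is what makes the per-word topic assignments interpretable.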
Jan-27-2025, 02:40:06 GMT