Appendix A Double-LSTM

Inspired by recent work [Dieng et al., 2019] showing that skip-connections promote higher latent information content, we introduce a simple-to-implement modification of the standard LSTM [Hochreiter and Schmidhuber, 1997]. Double-LSTM aims to promote the utilization of the latent variable z while increasing the expressive power of the autoregressive decoder. Double-LSTM consists of two LSTM units, as depicted in Figure 5. The first LSTM unit is updated based on the latent variable z and the previous hidden state h. The second LSTM unit is updated based on z, h, and the input word embedding w, which is subject to

Figure 5: Double-LSTM.
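A minimal sketch of such a two-unit decoder cell is given below, assuming a PyTorch implementation. The text does not specify how the two LSTM units' outputs are combined or whether they share memory cells; here we assume each unit keeps its own memory cell and the two hidden outputs are summed to form the next shared hidden state. All dimensions and the DoubleLSTM name are illustrative, not taken from the paper.

import torch
import torch.nn as nn

class DoubleLSTM(nn.Module):
    """Sketch of a Double-LSTM decoder cell (combination rule is an assumption)."""

    def __init__(self, embed_dim: int, latent_dim: int, hidden_dim: int):
        super().__init__()
        # First LSTM unit: driven by the latent variable z; the previous
        # hidden state h enters as the cell's recurrent state.
        self.lstm_z = nn.LSTMCell(latent_dim, hidden_dim)
        # Second LSTM unit: driven by z and the word embedding w, again
        # with h as the recurrent state.
        self.lstm_zw = nn.LSTMCell(latent_dim + embed_dim, hidden_dim)

    def forward(self, w, z, state):
        # state = ((h, c1), c2): shared hidden state h, per-unit memory cells.
        (h, c1), c2 = state
        h1, c1 = self.lstm_z(z, (h, c1))                           # unit 1: (z, h)
        h2, c2 = self.lstm_zw(torch.cat([z, w], dim=-1), (h, c2))  # unit 2: (z, h, w)
        h_new = h1 + h2  # assumed combination of the two units
        return h_new, ((h_new, c1), c2)

# Illustrative single decoding step (batch of 8).
cell = DoubleLSTM(embed_dim=256, latent_dim=32, hidden_dim=512)
w, z = torch.randn(8, 256), torch.randn(8, 32)
h = c1 = c2 = torch.zeros(8, 512)
out, state = cell(w, z, ((h, c1), c2))

Because z feeds both units at every step, the latent variable has a direct path to each hidden-state update, which is the mechanism the section credits for promoting utilization of z.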
