Reviews: A Theoretically Grounded Application of Dropout in Recurrent Neural Networks
–Neural Information Processing Systems
The paper gives a Bayesian interpretation of a dropout-like technique for training recurrent neural nets. This research direction is valuable for its potential influence on the theoretical understanding of both RNNs and dropout. However, under the proposed framework, the reason why this technique is useful is still somewhat unclear. In particular, the paper proposes a mixture prior on the rows of the weight matrices (line 129) without explaining the benefit of doing so, beyond the fact that it yields an interpretation of dropout. Also, much of the interpretation is a straightforward extension of the previously proposed interpretation of dropout in the feedforward case (Gal and Ghahramani, ICML 2016) and can hardly be considered a significant novelty; the empirical novelty is also diminished by the earlier paper of Moon et al.
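To make the technique under discussion concrete: the key practical difference from standard dropout is that one dropout mask is sampled per sequence and reused at every timestep (dropping the same rows of inputs/hidden units throughout), rather than resampling a fresh mask per step. Below is a minimal NumPy sketch under that reading; the function and variable names are illustrative, not taken from the paper.

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b):
    """One vanilla tanh RNN step."""
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b)

def run_rnn_variational_dropout(xs, W_xh, W_hh, b, p_drop=0.5, rng=None):
    """Run an RNN over a sequence, reusing one dropout mask for all timesteps."""
    rng = np.random.default_rng(0) if rng is None else rng
    n_in, n_hid = W_xh.shape
    # Sample the masks ONCE per sequence. Masking a whole unit is equivalent
    # to dropping the corresponding row of the weight matrix. Inverted-dropout
    # scaling keeps activations at the same expected magnitude.
    mask_x = (rng.random(n_in) > p_drop) / (1.0 - p_drop)
    mask_h = (rng.random(n_hid) > p_drop) / (1.0 - p_drop)
    h = np.zeros(n_hid)
    for x_t in xs:
        # The same masks are applied at every timestep.
        h = rnn_step(x_t * mask_x, h * mask_h, W_xh, W_hh, b)
    return h

# Tiny usage example with random weights.
rng = np.random.default_rng(42)
xs = rng.standard_normal((5, 3))            # 5 timesteps, 3 input features
W_xh = rng.standard_normal((3, 4)) * 0.1
W_hh = rng.standard_normal((4, 4)) * 0.1
b = np.zeros(4)
h_final = run_rnn_variational_dropout(xs, W_xh, W_hh, b, p_drop=0.5, rng=rng)
print(h_final.shape)
```

Resampling `mask_x` and `mask_h` inside the loop would recover ordinary per-step dropout, which is the variant the Bayesian interpretation argues against for recurrent connections.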
Jan-20-2025, 05:38:38 GMT