Recurrent Neural Networks and Long Short-Term Memory Networks: Tutorial and Survey
Ghojogh, Benyamin; Ghodsi, Ali
– arXiv.org Artificial Intelligence
This is a tutorial paper on Recurrent Neural Network (RNN), Long Short-Term Memory Network (LSTM), and their variants. We start with a dynamical system and backpropagation through time for RNN. Then, we discuss the problems of gradient vanishing and explosion in long-term dependencies. We explain close-to-identity weight matrix, long delays, leaky units, and echo state networks for solving this problem. Then, …

Several solutions were proposed for this issue, some of which are close-to-identity weight matrix (Mikolov et al., 2015), long delays (Lin et al., 1995), leaky units (Jaeger et al., 2007; Sutskever & Hinton, 2010), and echo state networks (Jaeger & Haas, 2004; Jaeger, 2007).

Sequence modeling requires both short-term and long-term dependencies. For example, consider the sentence "The police is chasing the thief".
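The gradient vanishing and explosion problem mentioned in the abstract can be made concrete with a small numerical sketch (not code from the paper; the linear-RNN setup, dimensions, and scalings below are illustrative assumptions): for a linear recurrence h_t = W h_{t-1}, backpropagation through time multiplies T Jacobians, each equal to W^T, so the gradient norm shrinks or grows roughly geometrically with the spectral radius of W, while a close-to-identity W, in the spirit of Mikolov et al. (2015), keeps it near 1.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
d, T = 10, 50  # hidden size and number of unrolled time steps (illustrative)

def bptt_jacobian_norms(W, T):
    """For a linear RNN h_t = W h_{t-1}, the gradient of a loss at step T
    with respect to h_0 involves the product of T Jacobians, each W^T.
    Returns the spectral norm of that product after each step."""
    prod = np.eye(W.shape[0])
    norms = []
    for _ in range(T):
        prod = W.T @ prod
        norms.append(np.linalg.norm(prod, 2))  # largest singular value
    return norms

# Random recurrent weights scaled so the spectral radius sits below / above 1.
W_shrink = 0.9 * rng.standard_normal((d, d)) / np.sqrt(d)  # gradients vanish
W_grow = 1.5 * rng.standard_normal((d, d)) / np.sqrt(d)    # gradients explode
# Close-to-identity weights: identity plus a small random perturbation.
W_near_id = np.eye(d) + 0.01 * rng.standard_normal((d, d))

for name, W in [("radius < 1", W_shrink), ("radius > 1", W_grow),
                ("near-identity", W_near_id)]:
    print(f"{name:14s} gradient norm after {T} steps ~ "
          f"{bptt_jacobian_norms(W, T)[-1]:.3e}")
```

Run as written, the radius < 1 case should decay toward zero and the radius > 1 case should grow by many orders of magnitude, while the near-identity case stays close to 1, which is the intuition behind the close-to-identity weight matrix as a remedy for long-term dependencies.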
Apr-22-2023