Optimizing Recurrent Neural Networks in cuDNN 5
cuDNN 5 introduces:

- Faster forward and backward convolutions using the Winograd convolution algorithm.
- Improved performance and reduced memory usage with FP16 routines on Pascal GPUs.
- Support for LSTM recurrent neural networks for sequence learning, delivering up to 6x speedup.

One of the new features we've added in cuDNN 5 is support for Recurrent Neural Networks (RNNs). RNNs are a powerful tool used for sequence learning in a number of fields, from speech recognition to image captioning. For a brief high-level introduction to RNNs, LSTMs, and sequence learning, I recommend you check out Tim Dettmers' recent post Deep Learning in a Nutshell: Sequence Learning; for more depth, see Soumith Chintala's post Understanding Natural Language with Deep Neural Networks Using Torch.
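To make the LSTM recurrence concrete, here is a minimal NumPy sketch of a single LSTM cell step — the per-timestep computation that cuDNN's RNN routines accelerate on the GPU. This is an illustrative reference implementation, not cuDNN's API; all function and variable names here are my own.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step (illustrative, not the cuDNN API).

    x:      input vector, shape (input_size,)
    h_prev: previous hidden state, shape (hidden_size,)
    c_prev: previous cell state, shape (hidden_size,)
    W:      input weights, shape (4*hidden_size, input_size)
    U:      recurrent weights, shape (4*hidden_size, hidden_size)
    b:      bias, shape (4*hidden_size,)
    Gate pre-activations are stacked as [input, forget, candidate, output].
    """
    z = W @ x + U @ h_prev + b
    H = h_prev.shape[0]
    i = sigmoid(z[0:H])          # input gate
    f = sigmoid(z[H:2*H])        # forget gate
    g = np.tanh(z[2*H:3*H])      # candidate cell update
    o = sigmoid(z[3*H:4*H])      # output gate
    c = f * c_prev + i * g       # new cell state
    h = o * np.tanh(c)           # new hidden state
    return h, c

# Run a short random sequence through the cell.
rng = np.random.default_rng(0)
input_size, hidden_size = 3, 4
h = np.zeros(hidden_size)
c = np.zeros(hidden_size)
W = rng.standard_normal((4 * hidden_size, input_size)) * 0.1
U = rng.standard_normal((4 * hidden_size, hidden_size)) * 0.1
b = np.zeros(4 * hidden_size)
for t in range(5):
    h, c = lstm_step(rng.standard_normal(input_size), h, c, W, U, b)
print(h.shape)  # (4,)
```

Each timestep depends on the previous hidden and cell state, which is exactly the sequential dependency that makes RNNs hard to parallelize naively and that the cuDNN implementation is engineered around.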
Dec-1-2016, 03:45:19 GMT