A Theoretically Grounded Application of Dropout in Recurrent Neural Networks
–Neural Information Processing Systems
Recurrent neural networks (RNNs) stand at the forefront of many recent developments in deep learning. Yet a major difficulty with these models is their tendency to overfit, with dropout shown to fail when applied to recurrent layers.
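The failure referred to here is naive dropout that samples a fresh mask at every timestep, which disrupts the recurrent dynamics. The remedy this paper proposes (variational dropout) is to sample one mask and reuse it at every timestep. A minimal pure-Python sketch of the two masking schemes, with illustrative function names of my own choosing:

```python
import random

def locked_dropout_masks(seq_len, dim, p, rng):
    """Variational ('locked') dropout: sample ONE inverted-dropout mask
    and reuse it at every timestep of the sequence."""
    mask = [0.0 if rng.random() < p else 1.0 / (1.0 - p) for _ in range(dim)]
    return [mask] * seq_len  # identical mask at each timestep

def naive_dropout_masks(seq_len, dim, p, rng):
    """Naive dropout: an independent mask per timestep -- the variant
    shown to fail when applied to recurrent connections."""
    return [[0.0 if rng.random() < p else 1.0 / (1.0 - p) for _ in range(dim)]
            for _ in range(seq_len)]
```

With `locked_dropout_masks`, every timestep sees the same set of dropped units, so the same approximate weight sample is used across the whole sequence; the naive variant changes the dropped units at every step.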