Structured in Space, Randomized in Time: Leveraging Dropout in RNNs for Efficient Training
–Neural Information Processing Systems
In this work, we identify dropout-induced sparsity in LSTMs as a suitable mode of computation reduction. Dropout is a widely used regularization mechanism that randomly drops computed neuron activations during each training iteration.
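As a minimal sketch of the dropout mechanism the abstract describes (not the paper's specific training scheme), the following applies inverted dropout to a hidden-state vector: each activation is zeroed with probability `p`, and survivors are rescaled by `1/(1-p)` so the expected activation is unchanged. The function name and shapes are illustrative assumptions.

```python
import numpy as np

def dropout(h, p, rng):
    """Inverted dropout: zero each activation with probability p,
    scaling survivors by 1/(1-p) to preserve the expected value."""
    mask = rng.random(h.shape) >= p          # True where the unit survives
    return np.where(mask, h / (1.0 - p), 0.0)

rng = np.random.default_rng(0)
h = rng.standard_normal(8)                   # e.g. an LSTM hidden state
h_train = dropout(h, p=0.5, rng=rng)         # sparse vector used this iteration
```

The zeroed entries are what make the per-iteration computation sparse: any downstream matrix product can skip the dropped units.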