Particle Filter Recurrent Neural Networks
Xiao Ma, Peter Karkus, David Hsu, Wee Sun Lee
Recurrent neural networks (RNNs) have been extraordinarily successful for prediction with sequential data. To tackle highly variable and noisy real-world data, we introduce Particle Filter Recurrent Neural Networks (PF-RNNs), a new RNN family that explicitly models uncertainty in its internal structure: while an RNN relies on a long, deterministic latent state vector, a PF-RNN maintains a latent state distribution, approximated as a set of particles. For effective learning, we provide a fully differentiable particle filter algorithm that updates the PF-RNN latent state distribution according to Bayes' rule. Experiments demonstrate that the proposed PF-RNNs outperform the corresponding standard gated RNNs on a synthetic robot localization dataset and 10 real-world sequence prediction datasets for text classification, stock price prediction, etc.
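To make the idea concrete, below is a minimal, hedged sketch of a particle-filter-style latent update in PyTorch. It is not the authors' implementation: the class name `ParticleLatentSketch`, the use of `nn.GRUCell` as the transition model, the learned likelihood head `obs_logit`, and the soft-resampling mixture weight `alpha` are all illustrative assumptions. It only shows the general pattern the abstract describes: propagate a set of particles with a shared recurrent cell, reweight them with an observation likelihood (a Bayes-style update), and resample differentiably.

```python
# Hedged sketch (not the paper's code): a particle-filter latent update for an RNN,
# with assumed dimensions, an assumed GRUCell transition, and an assumed soft-resampling step.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ParticleLatentSketch(nn.Module):
    """Keeps K weighted latent particles instead of a single deterministic hidden vector."""
    def __init__(self, input_dim, hidden_dim, num_particles=16, alpha=0.5):
        super().__init__()
        self.K = num_particles
        self.alpha = alpha                                   # soft-resampling trade-off (assumption)
        self.cell = nn.GRUCell(input_dim, hidden_dim)        # shared transition model (assumption)
        self.obs_logit = nn.Linear(hidden_dim + input_dim, 1)  # learned particle likelihood (assumption)

    def forward(self, x, particles, log_weights):
        # x: (B, input_dim); particles: (B, K, H); log_weights: (B, K), normalized in log space
        B, K, H = particles.shape
        x_rep = x.unsqueeze(1).expand(B, K, -1).reshape(B * K, -1)

        # Transition: propagate every particle with the shared recurrent cell.
        new_p = self.cell(x_rep, particles.reshape(B * K, H)).view(B, K, H)

        # Measurement: reweight particles by a learned observation likelihood (Bayes-style update).
        feats = torch.cat([new_p, x_rep.view(B, K, -1)], dim=-1)
        log_w = F.log_softmax(log_weights + self.obs_logit(feats).squeeze(-1), dim=-1)

        # Soft resampling: draw from a mixture of the weights and a uniform distribution,
        # then correct with importance ratios so gradients can flow through resampling.
        w = log_w.exp()
        q = self.alpha * w + (1.0 - self.alpha) / K
        idx = torch.multinomial(q, K, replacement=True)                       # (B, K)
        new_p = torch.gather(new_p, 1, idx.unsqueeze(-1).expand(-1, -1, H))
        new_w = torch.gather(w, 1, idx) / torch.gather(q, 1, idx)
        new_log_w = F.log_softmax(torch.log(new_w + 1e-8), dim=-1)
        return new_p, new_log_w
```

In a sequence model built this way, this update would be applied at every time step, and a prediction could be read out from the weighted mean of the particles; the mixture-plus-importance-correction resampling is what keeps the whole update differentiable end to end.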
May-30-2019