Convolutional State Space Models for Long-Range Spatiotemporal Modeling
Neural Information Processing Systems
Effectively modeling long spatiotemporal sequences is challenging because complex spatial correlations and long-range temporal dependencies must be captured simultaneously. ConvLSTMs attempt to address this by updating tensor-valued states with recurrent neural networks, but their sequential computation makes them slow to train. In contrast, Transformers can process an entire spatiotemporal sequence, compressed into tokens, in parallel.
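The sequential bottleneck of tensor-valued recurrence can be seen in a toy sketch. The snippet below uses a single gate-free convolutional recurrence rather than the full ConvLSTM gating, and the kernel names (`k_h`, `k_x`) are illustrative, not from the paper; the point is only that step `t` cannot begin until the state from step `t-1` is available.

```python
import numpy as np

def conv2d(x, k):
    # 'same'-padded 2D convolution via explicit loops (toy, not optimized)
    H, W = x.shape
    kh, kw = k.shape
    pad = np.pad(x, ((kh // 2,) * 2, (kw // 2,) * 2))
    out = np.empty_like(x)
    for i in range(H):
        for j in range(W):
            out[i, j] = np.sum(pad[i:i + kh, j:j + kw] * k)
    return out

rng = np.random.default_rng(0)
T, H, W = 8, 16, 16
xs = rng.standard_normal((T, H, W))          # toy spatiotemporal input
k_h = rng.standard_normal((3, 3)) * 0.1      # state-to-state kernel (illustrative)
k_x = rng.standard_normal((3, 3)) * 0.1      # input-to-state kernel (illustrative)

h = np.zeros((H, W))                         # tensor-valued hidden state
for t in range(T):
    # Strictly sequential: the update at step t needs h from step t-1,
    # which is what prevents parallelizing a ConvLSTM over time.
    h = np.tanh(conv2d(h, k_h) + conv2d(xs[t], k_x))

print(h.shape)
```

A Transformer, by contrast, attends over all T tokenized frames at once, trading this sequential dependency for quadratic cost in sequence length.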