A Joint Time-frequency Domain Transformer for Multivariate Time Series Forecasting
Chen, Yushu, Liu, Shengzhuo, Yang, Jinzhe, Jing, Hao, Zhao, Wenlai, Yang, Guangwen
arXiv.org Artificial Intelligence, Oct-28-2023
Multivariate time series forecasting has broad applications, including but not limited to climatology, energy, finance, trading, and logistics (Petropoulos et al., 2022). Following the great success of Transformers (Vaswani et al., 2017) in NLP (Kalyan et al., 2021), CV (Khan et al., 2021), and speech (Karita et al., 2019), Transformers have been introduced into time series forecasting and achieve promising results (Wen et al., 2022). One of the primary drawbacks of Transformers is their quadratic complexity in both computation and memory, which makes them less suitable for long-term forecasting. To address this limitation, a plethora of Transformer-based models, e.g., LogTrans, Informer, Autoformer, Performer, and Pyraformer (Li et al., 2019; Zhou et al., 2021; Wu et al., 2021; Choromanski et al., 2021; Liu et al., 2022a), have been proposed to enhance predictive performance while maintaining low complexity. Notably, Zhou et al. (2022b) observed that most time series that are dense in the time domain (TD) tend to have a sparse representation in the frequency domain (FD).
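The TD-dense/FD-sparse observation can be made concrete with a short numerical sketch. The following Python snippet (an illustrative assumption, not code from the paper) builds a signal that is nonzero at every time step yet whose energy concentrates in just two frequency bins:

```python
import numpy as np

# Minimal sketch of the observation in Zhou et al. (2022b):
# a series dense in the time domain (TD) can be sparse in the
# frequency domain (FD). Signal choice here is a hypothetical example.
n = 1024
t = np.arange(n)

# Sum of two sinusoids: every one of the n TD samples is nonzero (dense).
x = np.sin(2 * np.pi * 5 * t / n) + 0.5 * np.sin(2 * np.pi * 20 * t / n)

# The real FFT concentrates the energy in a handful of bins (sparse).
spectrum = np.abs(np.fft.rfft(x))
significant = np.sum(spectrum > 0.01 * spectrum.max())
print(f"TD samples: {n}, significant FD coefficients: {significant}")
# Prints 2 significant coefficients (bins 5 and 20); the rest are
# numerically zero, so the FD representation is far more compact.
```

This kind of FD sparsity is what motivates frequency-domain attention mechanisms: operating on the few significant coefficients rather than all n time steps.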