DB2-TransF: All You Need Is Learnable Daubechies Wavelets for Time Series Forecasting
Moulik Gupta, Achyut Mani Tripathi
–arXiv.org Artificial Intelligence
Baseline models (Model, Category, Key Characteristics):
- iTransformer [11] (Transformer-based): Processes each variate independently prior to multivariate fusion; regarded as the current state-of-the-art (SOTA) in time series forecasting.
- PatchTST [42] (Transformer-based): Divides the time series into patches and applies channel-independent shared embeddings and weights for feature extraction.
- Crossformer [35] (Transformer-based): Utilizes cross-attention mechanisms to effectively capture long-range dependencies across temporal sequences.
- FEDformer [43] (Transformer-based): Improves Transformer performance by leveraging frequency-domain sparsity, typically through Fourier transforms.
- Autoformer [33] (Transformer-based): Employs a decomposition-based architecture combined with an auto-correlation mechanism for effective time series modeling.
- RLinear [44] (Linear-based): A state-of-the-art linear model that incorporates reversible normalization and assumes channel independence.
- TiDE [45] (Linear-based): An encoder-decoder architecture built entirely from multi-layer perceptrons (MLPs).
- DLinear [46] (Linear-based): Among the earliest linear models for time series forecasting; uses a single-layer architecture combined with series decomposition.
- TimesNet [28] (Temporal Conv-based): Employs 2D convolutional kernels (TimesBlock) to model both intra-period and inter-period variations in time series data.
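For context on the wavelets named in the title: below is a minimal NumPy sketch of a one-level Daubechies-2 (db2) discrete wavelet transform using the fixed textbook filter coefficients. The paper's contribution of making such coefficients learnable is not reproduced here; the function name `dwt_db2` and the periodic-extension choice are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Daubechies-2 (db2) analysis filters (fixed textbook values; DB2-TransF
# instead treats such coefficients as learnable parameters).
s3 = np.sqrt(3.0)
lo = np.array([1 + s3, 3 + s3, 3 - s3, 1 - s3]) / (4 * np.sqrt(2.0))  # low-pass
hi = lo[::-1] * np.array([1.0, -1.0, 1.0, -1.0])  # quadrature-mirror high-pass

def dwt_db2(x):
    """One level of db2 DWT with periodic extension.

    x: 1-D array of even length. Returns (approx, detail), each of
    length len(x) // 2; the transform is orthonormal, so energy
    (sum of squares) is preserved.
    """
    n = len(x)
    xp = np.concatenate([x, x[:3]])  # wrap around for circular convolution
    approx = np.array([np.dot(lo, xp[i:i + 4]) for i in range(0, n, 2)])
    detail = np.array([np.dot(hi, xp[i:i + 4]) for i in range(0, n, 2)])
    return approx, detail
```

The low-pass output captures the smooth trend of the series and the high-pass output captures local fluctuations, which is why wavelet decompositions are a natural alternative to the moving-average or Fourier-based decompositions used by several baselines in the table.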
Dec-12-2025