TimePFN: Effective Multivariate Time Series Forecasting with Synthetic Data
Taga, Ege Onur; Ildiz, M. Emrullah; Oymak, Samet
–arXiv.org Artificial Intelligence
To appear in AAAI-2025 as a conference paper.

The diversity of time series applications and the scarcity of domain-specific data highlight the need for time-series models with strong few-shot learning capabilities. In this work, we propose a novel training scheme and a transformer-based architecture, collectively referred to as TimePFN, for multivariate time-series (MTS) forecasting. TimePFN is based on the concept of Prior-data Fitted Networks (PFN), which aims to approximate Bayesian inference. Our approach consists of (1) generating synthetic MTS data through diverse Gaussian process kernels and the linear coregionalization method, and (2) a novel MTS architecture capable of utilizing both temporal and cross-channel dependencies across all input patches. We evaluate TimePFN on several benchmark datasets and demonstrate that it outperforms the existing state-of-the-art models for MTS forecasting in both zero-shot and few-shot settings. Notably, fine-tuning TimePFN with as few as 500 data points nearly matches the full-dataset training error, and even 50 data points yield competitive results. We also find that TimePFN exhibits strong univariate forecasting performance, attesting to its generalization ability. Overall, this work unlocks the power of synthetic data priors for MTS forecasting and facilitates strong zero- and few-shot forecasting performance. Code -- https://github.com/egetaga/TimePFN

1 Introduction

Natural language processing has achieved remarkable success driven by advances in neural architectures and data pipelines. These advances underlie modern language and vision-language models that exhibit strong zero-shot and few-shot learning capabilities. Inspired by these developments, researchers have started exploring whether such methods and ideas can be extended to time series forecasting. For instance, a notable line of work (Zhou et al. 2021; Wu et al. 2021; Zhou et al. 2022; Zhang and Yan 2023) examines the use of the transformer architecture (Vaswani et al. 2017) in time-series forecasting. More recently, there has also been a push toward building foundation models for time series tasks (Ansari et al. 2024). However, the heterogeneous nature of time series data brings additional complications. As shown by Zeng et al. (2023), even simple linear models can outperform sophisticated transformer-based forecasters on standard benchmarks.
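To make the synthetic-data prior from the abstract concrete, the following is a minimal sketch of generating one multivariate series via Gaussian process kernels combined through the linear model of coregionalization: several latent GP samples are drawn with randomly chosen kernels, then mixed linearly across channels. The function names, kernel choices, and hyperparameter ranges here are illustrative assumptions, not the paper's exact generator.

```python
import numpy as np

def rbf_kernel(t, lengthscale):
    """Squared-exponential kernel matrix over a 1-D time grid."""
    d = t[:, None] - t[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def periodic_kernel(t, period, lengthscale):
    """Exactly periodic kernel over a 1-D time grid."""
    d = np.abs(t[:, None] - t[None, :])
    return np.exp(-2.0 * np.sin(np.pi * d / period) ** 2 / lengthscale ** 2)

def sample_lmc_series(T=256, C=8, rng=None):
    """Draw one synthetic multivariate series of shape (T, C).

    Each latent function is a GP sample with a randomly chosen kernel;
    channels are random linear mixtures of the latents (LMC), which
    induces cross-channel dependence.
    """
    rng = rng or np.random.default_rng()
    t = np.linspace(0.0, 10.0, T)
    Q = rng.integers(2, 5)            # number of latent GPs (assumed range)
    jitter = 1e-5 * np.eye(T)         # for numerical stability of Cholesky
    latents = np.empty((T, Q))
    for q in range(Q):
        if rng.random() < 0.5:
            K = rbf_kernel(t, lengthscale=rng.uniform(0.2, 2.0))
        else:
            K = periodic_kernel(t, period=rng.uniform(0.5, 3.0),
                                lengthscale=rng.uniform(0.5, 2.0))
        L = np.linalg.cholesky(K + jitter)
        latents[:, q] = L @ rng.standard_normal(T)
    A = rng.standard_normal((C, Q)) / np.sqrt(Q)  # coregionalization mixing matrix
    return latents @ A.T                          # (T, C) multivariate series

series = sample_lmc_series()
print(series.shape)  # (256, 8)
```

In a PFN-style training scheme, many such series would presumably be sampled on the fly to form the synthetic pretraining corpus, with the kernel family acting as the prior over forecasting tasks.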
Feb-22-2025