A decoder-only foundation model for time-series forecasting
Abhimanyu Das, Weihao Kong, Rajat Sen, Yichen Zhou
arXiv.org Artificial Intelligence
Motivated by recent advances in large language models for Natural Language Processing (NLP), we design a time-series foundation model for forecasting whose out-of-the-box zero-shot performance on a variety of public datasets comes close to the accuracy of state-of-the-art supervised forecasting models for each individual dataset. Our model is based on pretraining a patched-decoder style attention model on a large time-series corpus, and can work well across different forecasting history lengths, prediction lengths and temporal granularities.
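The abstract's "patched-decoder style attention model" can be made concrete with a minimal, illustrative sketch: split the input series into fixed-length patches, embed each patch as a token, run causal self-attention over the patch tokens, and map each output token to a block of future values. This is not the authors' code; the class name `PatchedDecoderForecaster` and all hyperparameters (patch length 32, horizon 128, model width 256, etc.) are assumptions chosen for illustration, not values taken from the paper.

```python
import torch
import torch.nn as nn

class PatchedDecoderForecaster(nn.Module):
    def __init__(self, patch_len=32, horizon_len=128, d_model=256,
                 n_heads=4, n_layers=2, max_patches=64):
        super().__init__()
        self.patch_len = patch_len
        # Each non-overlapping patch of raw values becomes one input token.
        self.embed = nn.Linear(patch_len, d_model)
        # Learned positional embeddings over patch positions (an assumption;
        # the paper's exact positional scheme may differ).
        self.pos = nn.Parameter(torch.zeros(1, max_patches, d_model))
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True
        )
        # An encoder stack run under a causal mask behaves as a decoder-only model.
        self.decoder = nn.TransformerEncoder(layer, n_layers)
        # Each output token predicts a whole block of future values at once.
        self.head = nn.Linear(d_model, horizon_len)

    def forward(self, series):  # series: (batch, context_len)
        b, t = series.shape
        usable = t - t % self.patch_len
        patches = series[:, :usable].reshape(b, -1, self.patch_len)
        tokens = self.embed(patches) + self.pos[:, : patches.size(1)]
        mask = nn.Transformer.generate_square_subsequent_mask(tokens.size(1))
        hidden = self.decoder(tokens, mask=mask)
        # Forecast the horizon from the last (most recent) patch token.
        return self.head(hidden[:, -1])

model = PatchedDecoderForecaster()
history = torch.randn(8, 512)   # 8 series, 512 past time points each
forecast = model(history)       # -> (8, 128) predicted future values
print(forecast.shape)
```

Because the history is re-patched on the fly and the mask is built from the actual token count, the same model accepts different context lengths, which is one way a single pretrained checkpoint can serve varying history lengths at inference time.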
Feb 4, 2024