Lightweight and Data-Efficient Multivariate Time Series Forecasting using Residual-Stacked Gaussian (RS-GLinear) Architecture
arXiv.org Artificial Intelligence
Following the success of Transformer architectures and their self-attention mechanism in language modelling -- particularly due to their ability to capture long-range dependencies -- many researchers have explored how these architectures can be adapted for time-series forecasting. Variants of Transformer-based models have been proposed to handle both short- and long-term sequence modeling, aiming to predict future time-dependent values from historical observations using varying input window sizes. However, despite the popularity of leveraging Transformer architectures to extract temporal relationships from sets of continuous datapoints, their performance in time-series forecasting has shown mixed results. Several researchers, including Zeng et al. (2022) and Rizvi et al. (2025), have challenged the reliability of emerging Transformer-based solutions for long-term forecasting tasks. In this research, our first objective is to evaluate the Gaussian-based Linear (GLinear) architecture proposed by Rizvi et al. (2025) and to develop an enhanced version of it -- referred to in this study as the Residual Stacked GLinear (RS-GLinear) model. The second objective is to assess the broader applicability of the RS-GLinear model by extending its use to additional domains -- financial time series and epidemiological data -- which were not explored in the baseline model proposed by Rizvi et al. (2025). Most time-series implementations we encountered (both Transformer-based and linear models) adopt baseline codebases provided by the Hugging Face repository, including the baseline GLinear model used in this study. The RS-GLinear model developed here therefore extends the codebase introduced by Rizvi et al. (2025).
Keywords -- Multivariate Time Series Forecasting, Transformer-based Models, Weather, Influenza-like Illness, Deep Learning, Residual-Stacked GLinear, Neural Network.

Time series forecasting has been an important research area in many domains such as finance/economics, retail, healthcare, cloud infrastructure, meteorology, and traffic management (Toner et al. 2024). Since the introduction of the Transformer model (Vaswani et al. 2017), a large body of research has focused on time-series forecasting using Large Language Models (LLMs), leveraging their ability to capture sequential dependencies in text generation (Tan et al. 2024).
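The residual-stacking idea behind RS-GLinear can be illustrated with a minimal sketch. This is a hypothetical, untrained NumPy toy, not the authors' implementation: we assume each block applies a linear map over the input window followed by a Gaussian-flavoured nonlinearity (a GELU here, as a stand-in for the model's Gaussian component), wrapped in a residual connection, with a final linear head projecting the window onto the forecast horizon, channel-independently for multivariate input.

```python
import numpy as np

def gelu(x):
    # Gaussian Error Linear Unit (tanh approximation); assumed stand-in
    # for the "Gaussian" nonlinearity in GLinear-style blocks
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

class ResidualStackedGLinearSketch:
    """Hypothetical RS-GLinear-style forecaster.

    Each block maps the length-L input window to length L through a
    linear layer + GELU, added back to its input (residual stacking).
    A final head maps L -> forecast horizon H. All weights are random
    placeholders; a real model would learn them.
    """

    def __init__(self, seq_len, horizon, n_blocks=2, seed=0):
        rng = np.random.default_rng(seed)
        # One (L, L) weight matrix per stacked block
        self.blocks = [rng.normal(0.0, 0.01, (seq_len, seq_len))
                       for _ in range(n_blocks)]
        # Projection from input window length L to horizon H
        self.head = rng.normal(0.0, 0.01, (seq_len, horizon))

    def forward(self, x):
        # x: (seq_len, channels) -- each channel handled independently
        h = x
        for W in self.blocks:
            h = h + gelu(W.T @ h)  # residual connection around each block
        return self.head.T @ h     # (horizon, channels)

model = ResidualStackedGLinearSketch(seq_len=24, horizon=8)
window = np.random.default_rng(1).normal(size=(24, 3))  # 3 variates
forecast = model.forward(window)
print(forecast.shape)  # (8, 3): 8-step-ahead forecast per variate
```

The residual connections let each stacked block learn a correction on top of the signal passed through from earlier blocks, which is the structural difference this paper's RS-GLinear adds over the single GLinear baseline.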
Oct-7-2025