Lightweight and Data-Efficient Multivariate Time Series Forecasting using Residual-Stacked Gaussian (RS-GLinear) Architecture

Ali, Abukar

arXiv.org Artificial Intelligence 

Following the success of Transformer architectures and their self-attention mechanism in language modelling -- particularly their ability to capture long-range dependencies -- many researchers have explored how these architectures can be adapted for time-series forecasting. Variants of Transformer-based models have been proposed to handle both short- and long-term sequence modelling, aiming to predict future time-dependent values from historical observations using varying input window sizes. However, despite the popularity of leveraging Transformer architectures to extract temporal relationships from sequences of continuous data points, their performance in time-series forecasting has shown mixed results. Several researchers, including Zeng et al. (2022) and Rizvi et al. (2025), have challenged the reliability of emerging Transformer-based solutions for long-term forecasting tasks. In this research, our first objective is to evaluate the Gaussian-based Linear (GLinear) architecture proposed by Rizvi et al. (2025) and to develop an enhanced version of it, referred to in this study as the Residual-Stacked GLinear (RS-GLinear) model. The second objective is to assess the broader applicability of the RS-GLinear model by extending its use to additional domains -- financial time series and epidemiological data -- which were not explored in the baseline model proposed by Rizvi et al. (2025). Most time-series implementations (Transformer-based and linear models) we came across commonly adopt baseline codebases provided by the Hugging Face repository, including the baseline GLinear model used in this study. The RS-GLinear model developed here is therefore an extended version of the codebase introduced by Rizvi et al. (2025).

Keywords -- Multivariate Time Series Forecasting, Transformer-based Models, Weather, Influenza-like Illness, Deep Learning, Transformer-based Architecture, Residual-Stacked GLinear, Neural Networks.

Time series forecasting has been an important research area in many domains such as finance and economics, retail, healthcare, cloud infrastructure, meteorology, and traffic management (Toner et al. 2024). Since the introduction of the Transformer model (Vaswani et al. 2017), a large body of research has focused on time-series forecasting with Large Language Models (LLMs), seeking to leverage the sequential-dependency modelling that makes LLMs effective at text generation (Tan et al. 2024).
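To make the residual-stacking idea concrete, the sketch below shows one plausible minimal PyTorch implementation. It assumes, based on the model's name, that a GLinear block is a linear projection over the time axis with a GELU (Gaussian Error Linear Unit) non-linearity, applied independently per channel, and that RS-GLinear stacks such blocks with additive skip connections before a linear forecasting head. The block internals, normalization choices, depth, and hyperparameters here are illustrative assumptions, not the exact implementation of Rizvi et al. (2025) or of this study.

```python
# Hypothetical sketch of a residual-stacked GLinear forecaster.
# Assumptions (not taken from the paper): each block is LayerNorm ->
# Linear -> GELU over the look-back window, blocks are joined by
# additive residual connections, and each variable (channel) of the
# multivariate series is processed independently along the time axis.
import torch
import torch.nn as nn

class GLinearBlock(nn.Module):
    """One Gaussian(GELU)-activated linear block over the time axis."""
    def __init__(self, seq_len: int):
        super().__init__()
        self.norm = nn.LayerNorm(seq_len)
        self.fc = nn.Linear(seq_len, seq_len)
        self.act = nn.GELU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, seq_len); skip connection around the block
        return x + self.act(self.fc(self.norm(x)))

class RSGLinear(nn.Module):
    """Residual stack of GLinear blocks plus a linear forecasting head."""
    def __init__(self, seq_len: int, pred_len: int, num_blocks: int = 3):
        super().__init__()
        self.blocks = nn.Sequential(
            *[GLinearBlock(seq_len) for _ in range(num_blocks)]
        )
        self.head = nn.Linear(seq_len, pred_len)  # window -> horizon

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, channels), the layout common in LTSF codebases
        x = x.permute(0, 2, 1)           # -> (batch, channels, seq_len)
        y = self.head(self.blocks(x))    # -> (batch, channels, pred_len)
        return y.permute(0, 2, 1)        # -> (batch, pred_len, channels)

# Usage: forecast 96 future steps from a 336-step window of 7 variables.
model = RSGLinear(seq_len=336, pred_len=96, num_blocks=3)
out = model(torch.randn(32, 336, 7))    # out.shape == (32, 96, 7)
```

The residual connections let each stacked block learn a correction on top of the previous block's representation rather than re-learning the mapping from scratch, which is one common motivation for stacking otherwise simple linear predictors.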
