AutoHFormer: Efficient Hierarchical Autoregressive Transformer for Time Series Prediction

Qianru Zhang, Honggang Wen, Ming Li, Dong Huang, Siu-Ming Yiu, Christian S. Jensen, Pietro Liò

arXiv.org Artificial Intelligence 

Abstract--Time series forecasting requires architectures that simultaneously achieve three competing objectives: (1) strict temporal causality for reliable predictions, (2) sub-quadratic complexity for practical scalability, and (3) multi-scale pattern recognition for accurate long-horizon forecasting. We introduce AutoHFormer, a hierarchical autoregressive transformer that addresses these challenges through key innovations that include: 1) hierarchical temporal modeling, which decomposes predictions into segment-level blocks processed in parallel, followed by intra-segment sequential refinement. This dual-scale approach maintains temporal coherence while enabling efficient computation, and it avoids both the anti-causal violations of standard transformers and the sequential bottlenecks of RNN hybrids. 2) An adaptive temporal encoding that combines fixed oscillating patterns for short-term variations with learnable decay rates for long-term trends. Comprehensive experiments demonstrate that AutoHFormer achieves 10.76× faster training and a 6.06× memory reduction compared to PatchTST on PEMS08, while maintaining consistent accuracy across 96-720 step horizons in most cases. These breakthroughs establish new benchmarks for efficient and precise time series modeling.

I. Introduction

Time series forecasting [1, 2, 3, 4, 5] stands as a fundamental pillar of modern predictive analytics, enabling data-driven decision making across numerous mission-critical domains. As demonstrated in recent literature [6, 7], this task has become increasingly vital in our data-rich era.
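To make the two mechanisms described in the abstract concrete, the following is a minimal, self-contained PyTorch-style sketch of how they could be realized; it is an illustration under stated assumptions, not the authors' implementation. The module names DecayingSinusoidalPE and HierarchicalARDecoder are hypothetical, and the GRU cell used for intra-segment refinement is a stand-in for whatever causal update AutoHFormer actually applies within each segment.

import math
import torch
import torch.nn as nn
import torch.nn.functional as F


class DecayingSinusoidalPE(nn.Module):
    """Fixed sinusoidal oscillations damped by learnable per-dimension decay
    rates: the sinusoids model short-term variation, the decay long-term
    trend attenuation (a hypothetical realization of the encoding described
    in the abstract). Assumes an even d_model."""

    def __init__(self, d_model, max_len=1024):
        super().__init__()
        pos = torch.arange(max_len, dtype=torch.float32).unsqueeze(1)
        freq = torch.exp(torch.arange(0, d_model, 2, dtype=torch.float32)
                         * (-math.log(10000.0) / d_model))
        pe = torch.zeros(max_len, d_model)
        pe[:, 0::2] = torch.sin(pos * freq)  # fixed oscillating patterns
        pe[:, 1::2] = torch.cos(pos * freq)
        self.register_buffer("pe", pe)
        self.decay = nn.Parameter(torch.zeros(d_model))  # learnable decay rates

    def forward(self, x):
        # x: (batch, time, d_model)
        T = x.size(1)
        t = torch.arange(T, device=x.device, dtype=torch.float32).unsqueeze(1)
        damp = torch.exp(-F.softplus(self.decay) * t / T)  # (T, d_model)
        return x + damp * self.pe[:T]


class HierarchicalARDecoder(nn.Module):
    """Segment-level blocks drafted in parallel, then refined sequentially
    inside each segment so strict left-to-right causality is preserved."""

    def __init__(self, d_model, seg_len):
        super().__init__()
        self.seg_len = seg_len
        self.segment_head = nn.Linear(d_model, seg_len * d_model)  # parallel draft
        self.refine = nn.GRUCell(d_model, d_model)  # sequential stand-in

    def forward(self, ctx, n_segments):
        # ctx: (batch, d_model) summary of the encoded history
        B, D = ctx.shape
        h, outs = ctx, []
        for _ in range(n_segments):
            # Coarse scale: all seg_len steps of the segment drafted at once.
            draft = self.segment_head(h).view(B, self.seg_len, D)
            # Fine scale: causal left-to-right refinement within the segment.
            refined = []
            for s in range(self.seg_len):
                h = self.refine(draft[:, s], h)
                refined.append(h)
            outs.append(torch.stack(refined, dim=1))
        return torch.cat(outs, dim=1)  # (batch, n_segments * seg_len, d_model)

With a hypothetical seg_len of 48, a 720-step horizon decomposes into 15 coarse blocks, so the sequential inner loop never exceeds 48 steps; this illustrates how segment-level parallelism can shorten the sequential critical path relative to fully step-by-step autoregression.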
