HGTS-Former: Hierarchical HyperGraph Transformer for Multivariate Time Series Analysis
Wang, Xiao, Si, Hao, Zhang, Fan, Zhou, Xiaoya, Sun, Dengdi, Lyu, Wanli, Yang, Qingquan, Tang, Jin
arXiv.org Artificial Intelligence
Multivariate time series analysis has long been a key research topic in artificial intelligence. However, analyzing complex time series data remains a challenging and unresolved problem due to its high dimensionality, dynamic nature, and complex interactions among variables. Inspired by the strong structural modeling capability of hypergraphs, this paper proposes a novel hypergraph-based time series Transformer backbone network, termed HGTS-Former, to address multivariate coupling in time series data. Specifically, given a multivariate time series signal, we first normalize it and embed each patch into a token. We then apply multi-head self-attention to enhance the temporal representation of each patch. Hierarchical hypergraphs are constructed to aggregate the temporal patterns within each channel and capture fine-grained relations between different variables. After that, we convert hyperedges into node features through the EdgeToNode module and adopt a feed-forward network to further enhance the output features. Extensive experiments conducted on two multivariate time series tasks and eight datasets fully validate the effectiveness of the proposed HGTS-Former. The source code will be released at https://github.com/Event-AHU/Time_Series_Analysis.
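The pipeline the abstract describes (patch tokens, self-attention over patches, hyperedge aggregation, and an EdgeToNode step that maps hyperedge features back onto nodes) can be sketched roughly as below. This is a minimal single-head, pure-Python illustration, not the authors' implementation: the function names, the fixed toy hyperedge memberships, and the mean-pooling aggregation are assumptions for clarity; the paper uses multi-head attention and constructs hierarchical hypergraphs both within each channel and across variables.

```python
import math

def softmax(xs):
    # numerically stable softmax over a list of scores
    m = max(xs)
    e = [math.exp(x - m) for x in xs]
    s = sum(e)
    return [v / s for v in e]

def self_attention(tokens):
    # single-head scaled dot-product attention over patch tokens
    # (the paper uses multi-head attention; one head shown for brevity)
    d = len(tokens[0])
    out = []
    for q in tokens:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in tokens]
        w = softmax(scores)
        out.append([sum(w[j] * tokens[j][i] for j in range(len(tokens)))
                    for i in range(d)])
    return out

def hyperedge_aggregate(tokens, hyperedges):
    # each hyperedge pools the tokens of its member nodes
    # (mean pooling is an assumption for illustration)
    d = len(tokens[0])
    return [[sum(tokens[m][i] for m in members) / len(members)
             for i in range(d)]
            for members in hyperedges]

def edge_to_node(tokens, hyperedges, edge_feats):
    # hypothetical EdgeToNode step: add each hyperedge feature
    # back onto every member node (residual-style update)
    out = [list(t) for t in tokens]
    for members, e in zip(hyperedges, edge_feats):
        for m in members:
            out[m] = [t + f for t, f in zip(out[m], e)]
    return out

# Toy run: 3 patch tokens of dimension 2, 2 fixed hyperedges.
tokens = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
hyperedges = [[0, 1], [1, 2]]
attended = self_attention(tokens)
edge_feats = hyperedge_aggregate(tokens, hyperedges)
updated = edge_to_node(tokens, hyperedges, edge_feats)
```

A feed-forward network would then refine `updated` per node, as the abstract describes; the hyperedge memberships here are hard-coded, whereas HGTS-Former constructs them hierarchically from the data.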
Aug-5-2025