FreDF: Learning to Forecast in Frequency Domain

Hao Wang, Licheng Pan, Zhichao Chen, Degui Yang, Sen Zhang, Yifei Yang, Xinggao Liu, Haoxuan Li, Dacheng Tao

arXiv.org Artificial Intelligence 

Time series modeling aims to encode historical sequences to predict future data, which is crucial in diverse applications: long-term forecasting in weather prediction [3, 40], short-term prediction in industrial maintenance [24, 7, 35], and missing-data imputation in healthcare [30]. A key challenge in time series modeling, distinguishing it from canonical regression tasks, is the presence of autocorrelation: the dependence between time steps, which exists in both the input and label sequences. To accommodate autocorrelation in input sequences, diverse forecast models have been developed [28, 5, 8], exemplified by recurrent [29], convolutional [37], and graph neural networks [25, 4, 11]. Recently, Transformer-based models, which use self-attention mechanisms to dynamically assess autocorrelation, have gained prominence in this line of work [20, 26, 13, 38]. Concurrently, there is a growing trend of incorporating frequency analysis into forecast models [41, 21].
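To make the notion of autocorrelation concrete, the following sketch (an illustration of the general concept, not code from the paper) estimates the sample autocorrelation of a sequence at a given lag with NumPy; the `autocorr` helper and the AR(1) toy series are assumptions for demonstration only. A strongly autocorrelated process shows a lag-1 coefficient near its recurrence weight, while white noise shows a coefficient near zero.

```python
import numpy as np

def autocorr(x, lag):
    """Sample autocorrelation of a 1-D series at a given positive lag."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()  # center the series before correlating
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

rng = np.random.default_rng(0)

# Toy AR(1) process x_t = 0.9 * x_{t-1} + noise: strongly autocorrelated.
x = np.zeros(1000)
for t in range(1, 1000):
    x[t] = 0.9 * x[t - 1] + rng.standard_normal()

# White noise for contrast: time steps are independent.
white = rng.standard_normal(1000)

print(autocorr(x, lag=1))      # near 0.9 for the AR(1) series
print(autocorr(white, lag=1))  # near 0.0 for white noise
```

This is the dependence structure, present in both input and label sequences, that the models cited above are designed to capture.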