TimePre: Bridging Accuracy, Efficiency, and Stability in Probabilistic Time-Series Forecasting

Jiang, Lingyu, Xu, Lingyu, Li, Peiran, Ge, Qianwen, Zhuang, Dingyi, Xing, Shuo, Chen, Wenjing, Gao, Xiangbo, Chen, Ting-Hsuan, Zhan, Xueying, Zhang, Xin, Zhang, Ziming, Tu, Zhengzhong, Zielewski, Michael, Yamada, Kazunori, Lin, Fangzhou

arXiv.org Artificial Intelligence

Probabilistic Time-Series Forecasting (PTSF) is critical for uncertainty-aware decision making, but existing generative models, such as diffusion-based approaches, are computationally prohibitive due to expensive iterative sampling. Non-sampling frameworks such as Multiple Choice Learning (MCL) offer an efficient alternative but suffer from severe training instability and hypothesis collapse, which have historically hindered their performance. The problem is exacerbated when these frameworks are combined with modern, efficient MLP-based backbones. To resolve this incompatibility, we propose TimePre, a framework that unifies the efficiency of MLP-based models with the distributional flexibility of the MCL paradigm. The core of our solution is Stabilized Instance Normalization (SIN), a novel normalization layer that stabilizes the hybrid architecture by correcting channel-wise statistical shifts, resolving catastrophic hypothesis collapse. Extensive experiments on six benchmark datasets show that TimePre achieves new state-of-the-art accuracy on key probabilistic metrics. Critically, TimePre achieves inference speeds orders of magnitude faster than sampling-based models and, unlike prior MCL work, demonstrates stable performance scaling. It thus bridges the long-standing gap between accuracy, efficiency, and stability in probabilistic forecasting.
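The abstract does not give SIN's formulation, but the idea of correcting channel-wise statistical shifts builds on ordinary instance normalization, where each channel of an input series is rescaled to zero mean and unit variance. The following is a minimal sketch of that baseline operation only; the function name and signature are illustrative, and TimePre's actual SIN layer adds stabilization specifics described in the paper, not reproduced here.

```python
def instance_normalize(series, eps=1e-5):
    """Per-channel instance normalization (background for SIN, not SIN itself).

    series: list of channels, each a list of floats over time.
    Returns the series with every channel shifted to zero mean and
    scaled to (approximately) unit variance; eps guards against
    division by zero on constant channels.
    """
    normalized = []
    for channel in series:
        mean = sum(channel) / len(channel)
        var = sum((x - mean) ** 2 for x in channel) / len(channel)
        scale = (var + eps) ** 0.5
        normalized.append([(x - mean) / scale for x in channel])
    return normalized
```

Removing per-channel mean and variance in this way is what makes the backbone insensitive to level and scale shifts between training and test windows; SIN's contribution, per the abstract, is making such a correction stable inside a multi-hypothesis MLP architecture.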


Winner-takes-all for Multivariate Probabilistic Time Series Forecasting

Cortés, Adrien, Rehm, Rémi, Letzelter, Victor

arXiv.org Machine Learning

We introduce TimeMCL, a method leveraging the Multiple Choice Learning (MCL) paradigm to forecast multiple plausible time series futures. Our approach employs a neural network with multiple heads and utilizes the Winner-Takes-All (WTA) loss to promote diversity among predictions. MCL has recently gained attention due to its simplicity and ability to address ill-posed and ambiguous tasks. We propose an adaptation of this framework for time-series forecasting, presenting it as an efficient method to predict diverse futures, which we relate to its implicit quantization objective. We provide insights into our approach using synthetic data and evaluate it on real-world time series, demonstrating its promising performance at a light computational cost.
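The Winner-Takes-All loss described above can be sketched in a few lines: each head produces a candidate future, only the head closest to the realized target incurs loss (and hence receives gradient during training), which pushes heads to specialize on distinct plausible futures. The function names below are illustrative, not from the TimeMCL codebase, and real implementations operate on tensors with autograd rather than plain lists.

```python
def mse(pred, target):
    """Mean squared error between two equal-length sequences."""
    return sum((p - t) ** 2 for p, t in zip(pred, target)) / len(target)

def wta_loss(head_predictions, target):
    """Winner-Takes-All loss over a list of per-head predictions.

    Returns (loss, winner_index): only the best-matching head's error
    counts, so in a training loop gradients would flow only through
    the winning head, promoting diversity among the heads.
    """
    losses = [mse(pred, target) for pred in head_predictions]
    winner = min(range(len(losses)), key=losses.__getitem__)
    return losses[winner], winner
```

For example, with two heads predicting `[1.0, 2.0]` and `[0.0, 0.0]` against the target `[1.0, 2.0]`, the first head wins and the loss is its error alone; the second head is untouched and remains free to cover a different mode of the predictive distribution, which is the implicit quantization behavior the abstract alludes to.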