APT: Affine Prototype-Timestamp For Time Series Forecasting Under Distribution Shift
Yujie Li, Zezhi Shao, Chengqing Yu, Yisong Fu, Tao Sun, Yongjun Xu, Fei Wang
arXiv.org Artificial Intelligence
Time series forecasting under distribution shift remains challenging, as existing deep learning models often rely on local statistical normalization (e.g., mean and variance) that fails to capture global distribution shift. Methods like RevIN and its variants attempt to decouple distribution and pattern but still struggle with missing values, noisy observations, and ineffective channel-wise affine transformations. To address these limitations, we propose Affine Prototype-Timestamp (APT), a lightweight and flexible plug-in module that injects global distribution features into the normalization-forecasting pipeline. By leveraging timestamp-conditioned prototype learning, APT dynamically generates affine parameters that modulate both the input and output series, enabling the backbone to learn from self-supervised, distribution-aware clustered instances. APT is compatible with arbitrary forecasting backbones and normalization strategies while introducing minimal computational overhead. Extensive experiments across six benchmark datasets and multiple backbone-normalization combinations demonstrate that APT significantly improves forecasting performance under distribution shift.
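The abstract's core mechanism (timestamp features select a convex mixture of learned prototypes, and the mixed prototype yields per-channel affine parameters applied before the backbone and inverted after it) can be sketched in plain numpy. This is a minimal illustration under our own assumptions, not the paper's implementation: the prototype bank, the timestamp encoding (sin/cos of hour-of-day), the projection matrix `W`, and the dimensions are all hypothetical stand-ins for what would be learned modules in practice.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not from the paper): K prototypes, C channels, T steps.
K, C, T = 4, 3, 24

# Prototype bank: each prototype stores per-channel affine parameters.
# In the real module these would be learned; here they are random for illustration.
proto_scale = rng.normal(1.0, 0.1, size=(K, C))
proto_shift = rng.normal(0.0, 0.1, size=(K, C))

def timestamp_weights(hour, W):
    """Map a timestamp feature to softmax weights over the K prototypes."""
    feat = np.array([np.sin(2 * np.pi * hour / 24),
                     np.cos(2 * np.pi * hour / 24)])
    logits = W @ feat                      # (K,)
    e = np.exp(logits - logits.max())
    return e / e.sum()                     # convex weights over prototypes

def apt_modulate(x, hour, W):
    """Affine-modulate a (T, C) input window with timestamp-conditioned params."""
    w = timestamp_weights(hour, W)         # (K,)
    scale = w @ proto_scale                # (C,) mixture of prototype scales
    shift = w @ proto_shift                # (C,)
    return x * scale + shift, (scale, shift)

def apt_demodulate(y, scale, shift):
    """Invert the affine transform on the backbone's output."""
    return (y - shift) / scale

W = rng.normal(size=(K, 2))                # hypothetical timestamp projection
x = rng.normal(size=(T, C))
x_mod, (s, b) = apt_modulate(x, hour=9, W=W)
x_rec = apt_demodulate(x_mod, s, b)
print(np.allclose(x_rec, x))               # the affine pair is invertible
```

Because the same (scale, shift) pair wraps the input and output, the module is a plug-in around any backbone and any normalization scheme, which is consistent with the compatibility claim in the abstract.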
Nov-18-2025