Self-Distilled Representation Learning for Time Series
Felix Pieper, Konstantin Ditschuneit, Martin Genzel, Alexandra Lindt, Johannes Otterbach
arXiv.org Artificial Intelligence
Self-supervised learning for time-series data holds potential similar to that recently unleashed in Natural Language Processing and Computer Vision. While most existing works in this area focus on contrastive learning, we propose a conceptually simple yet powerful non-contrastive approach, based on the data2vec self-distillation framework. The core of our method is a student-teacher scheme that predicts the latent representation of an input time series from masked views of the same time series. This strategy avoids strong modality-specific assumptions and biases typically introduced by the design of contrastive sample pairs. We demonstrate the competitiveness of our approach for classification and forecasting as downstream tasks, comparing with state-of-the-art self-supervised learning methods on the UCR and UEA archives as well as the ETT and Electricity datasets.
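The student-teacher scheme described in the abstract can be sketched as follows. This is a minimal illustration under assumptions of my own: a small PyTorch Transformer encoder, non-overlapping patching of a univariate series, and illustrative hyperparameters (patch length, masking ratio, EMA decay) that are not the paper's settings. The names `Encoder`, `SelfDistiller`, and `update_teacher` are hypothetical.

```python
# Sketch of data2vec-style self-distillation for time series.
# Hyperparameters and architecture are illustrative, not the paper's.
import copy
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Patchify a univariate series and encode it with a Transformer."""
    def __init__(self, patch_len=16, dim=64, depth=4, heads=4):
        super().__init__()
        self.patch_len = patch_len
        self.embed = nn.Linear(patch_len, dim)
        layer = nn.TransformerEncoderLayer(dim, heads, dim * 4, batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, depth)

    def forward(self, x):                                      # x: (B, length)
        patches = x.unfold(1, self.patch_len, self.patch_len)  # (B, N, patch_len)
        return self.blocks(self.embed(patches))                # (B, N, dim)

class SelfDistiller(nn.Module):
    def __init__(self, encoder, ema_decay=0.999, mask_ratio=0.5):
        super().__init__()
        self.student = encoder
        self.teacher = copy.deepcopy(encoder)  # EMA copy, never backpropagated
        for p in self.teacher.parameters():
            p.requires_grad = False
        self.ema_decay = ema_decay
        self.mask_ratio = mask_ratio
        self.head = nn.Linear(64, 64)          # matches the encoder's default width

    @torch.no_grad()
    def update_teacher(self):
        # Teacher weights track the student via an exponential moving average.
        for ps, pt in zip(self.student.parameters(), self.teacher.parameters()):
            pt.mul_(self.ema_decay).add_(ps, alpha=1 - self.ema_decay)

    def forward(self, x):
        with torch.no_grad():
            targets = self.teacher(x)          # latent targets from the unmasked view
        # Zero out random patches to build the student's masked view.
        B, N = targets.shape[:2]
        mask = torch.rand(B, N, device=x.device) < self.mask_ratio
        x_masked = x.clone().view(B, N, -1)
        x_masked[mask] = 0.0
        preds = self.head(self.student(x_masked.view(B, -1)))
        # Regress the teacher's latents only at the masked positions.
        return nn.functional.mse_loss(preds[mask], targets[mask])

# Usage: one training step on a random batch of series of length 128.
model = SelfDistiller(Encoder())
loss = model(torch.randn(8, 128))
loss.backward()
model.update_teacher()
```

The exponential-moving-average teacher is what makes the scheme non-contrastive: the student regresses continuous latent targets rather than discriminating positive from negative pairs, so no modality-specific augmentations are needed to construct contrastive samples.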
Nov-19-2023