PRISM: Lightweight Multivariate Time-Series Classification through Symmetric Multi-Resolution Convolutional Layers
Federico Zucchi, Thomas Lampert
arXiv.org Artificial Intelligence
Multivariate time series classification supports applications from wearable sensing to biomedical monitoring, and demands models that can capture both short-term patterns and longer-range temporal dependencies. Despite recent advances, Transformer and CNN models often remain computationally heavy and rely on many parameters. This work presents PRISM (Per-channel Resolution Informed Symmetric Module), a lightweight, fully convolutional classifier. Operating in a channel-independent manner, its early stage applies a set of multi-resolution symmetric convolutional filters. This symmetry enforces structural constraints inspired by linear-phase FIR filters from classical signal processing, effectively halving the number of learnable parameters in the initial layers while preserving the full receptive field. Across the diverse UEA multivariate time-series archive, as well as specific benchmarks in human activity recognition, sleep staging, and biomedical signals, PRISM matches or outperforms state-of-the-art CNN and Transformer models while using significantly fewer parameters and markedly lower computational cost. By bringing a principled signal-processing prior into a modern neural architecture, PRISM offers an effective and computationally economical solution for multivariate time series classification.

1. Introduction

Multivariate time series, characterised by intricate temporal dependencies, are common in finance, healthcare, environmental science, and human activity recognition. Deep learning has improved analysis and classification for such data, yet state-of-the-art models often incur high computational cost, heavy parameterisation, and limited robustness in realistic data regimes. Transformer architectures, adapted from NLP for long-range dependencies, have been applied to time series. Despite promising results, their extensive parameter counts can lead to overfitting and high memory use [1].
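The symmetric-filter idea can be illustrated with a minimal NumPy sketch. This is not the paper's implementation; the function names, kernel sizes, and channel-independent filtering loop below are illustrative assumptions. The key point it shows is the linear-phase constraint: an odd-length kernel of 2H-1 taps is built by mirroring H free parameters, so the learnable-parameter count is roughly halved while the receptive field is unchanged.

```python
import numpy as np

def symmetric_kernel(half):
    # Mirror the free parameters to form an odd-length symmetric kernel:
    # half = [h0, ..., h_{H-1}]  ->  [h0, ..., h_{H-1}, ..., h1, h0]
    # A kernel of length 2H-1 thus costs only H learnable values, and its
    # symmetry makes it a linear-phase FIR filter.
    return np.concatenate([half, half[-2::-1]])

def multi_resolution_features(x, halves):
    # x: (channels, time). Each channel is filtered independently
    # (channel-independent operation) by every multi-resolution kernel.
    feats = []
    for half in halves:
        k = symmetric_kernel(half)
        feats.append(np.stack([np.convolve(c, k, mode="same") for c in x]))
    return np.stack(feats)  # (n_kernels, channels, time)

# Toy example: a 3-channel series and two receptive-field sizes
# (5 and 9 taps, parameterised by only 3 and 5 values respectively).
rng = np.random.default_rng(0)
x = rng.standard_normal((3, 64))
halves = [rng.standard_normal(3), rng.standard_normal(5)]
out = multi_resolution_features(x, halves)
print(out.shape)  # (2, 3, 64)
```

Mirroring rather than learning the full kernel is the signal-processing prior at work: a symmetric impulse response guarantees linear phase, i.e. the filter delays all frequencies equally and does not distort waveform shape, which matters for morphologically meaningful signals such as ECG or EEG.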
In practice, self-attention can struggle with noisy, redundant signals [2, 3].
Dec-10-2025