SNAP: Low-Latency Test-Time Adaptation with Sparse Updates
Hyeongheon Cha, Dong Min Kim, Hye Won Chung, Taesik Gong, Sung-Ju Lee
arXiv.org Artificial Intelligence
Test-Time Adaptation (TTA) adjusts models using unlabeled test data to handle dynamic distribution shifts. However, existing methods rely on frequent adaptation and incur high computational cost, making them impractical for resource-constrained edge environments. To address this, we propose SNAP, a sparse TTA framework that reduces adaptation frequency and data usage while preserving accuracy. SNAP maintains competitive accuracy even when adapting based on only 1% of the incoming data stream, demonstrating its robustness under infrequent updates. Our method introduces two key components: (i) Class and Domain Representative Memory (CnDRM), which identifies and stores a small set of samples representative of both class and domain characteristics to support efficient adaptation with limited data; and (ii) Inference-only Batch-aware Memory Normalization (IoBMN), which dynamically adjusts normalization statistics at inference time by leveraging these representative samples, enabling efficient alignment to shifting target domains. Integrated with five state-of-the-art TTA algorithms, SNAP reduces latency by up to 93.12% while keeping the accuracy drop below 3.3%, across adaptation rates ranging from 1% to 50%. This demonstrates its strong potential for practical use on edge devices serving latency-sensitive applications. The source code is available at https://github.com/chahh9808/SNAP.
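The abstract's IoBMN component adjusts normalization statistics at inference time using the stored representative samples. The details live in the paper; as a rough illustration of the idea, one minimal sketch is to blend the model's running batch-norm statistics with statistics computed from the representative-sample memory before normalizing an incoming batch, without touching any model parameters. The function name, the single blending weight `alpha`, and the linear interpolation are assumptions for illustration, not the paper's exact formulation:

```python
import numpy as np

def iobmn_normalize(x, running_mean, running_var, memory, alpha=0.5, eps=1e-5):
    """Hypothetical sketch of inference-only memory-aware normalization.

    x:            incoming batch, shape (batch, features)
    running_mean: the model's stored BN mean, shape (features,)
    running_var:  the model's stored BN variance, shape (features,)
    memory:       class/domain-representative samples, shape (mem_size, features)
    alpha:        assumed blending weight between memory and running statistics
    """
    # Statistics of the small representative memory (CnDRM-style buffer).
    mem_mean = memory.mean(axis=0)
    mem_var = memory.var(axis=0)
    # Blend memory statistics with the source model's running statistics;
    # no parameter is updated, so this is inference-only.
    mean = alpha * mem_mean + (1.0 - alpha) * running_mean
    var = alpha * mem_var + (1.0 - alpha) * running_var
    return (x - mean) / np.sqrt(var + eps)
```

With `alpha=0` this reduces to ordinary normalization with the source model's running statistics; larger `alpha` shifts the statistics toward the current target domain as captured by the memory.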
Nov-20-2025