Cost-Efficient Continual Learning with Sufficient Exemplar Memory
Dongkyu Cho, Taesup Moon, Rumi Chunara, Kyunghyun Cho, Sungmin Cha
arXiv.org Artificial Intelligence
Continual learning (CL) research typically assumes highly constrained exemplar memory resources. However, in many real-world scenarios, especially in the era of large foundation models, memory is abundant, while GPU computational cost is the primary bottleneck. In this work, we investigate CL in a novel setting where exemplar memory is ample (i.e., sufficient exemplar memory). Unlike prior methods designed for strict exemplar memory constraints, we propose a simple yet effective approach that operates directly in the model's weight space through a combination of weight resetting and averaging techniques. Our method achieves state-of-the-art performance while reducing the computational cost to a quarter to a third of that of existing methods. These findings challenge conventional CL assumptions and provide a practical baseline for computationally efficient CL applications.

Continual learning (CL) has attracted significant attention as a paradigm enabling machine learning models to adapt to sequential tasks while overcoming catastrophic forgetting of previously acquired knowledge (Wang et al., 2024).
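The abstract states only that the method combines weight resetting and weight averaging in the model's weight space; it gives no algorithmic details. The sketch below is therefore an assumption-based illustration of what such an update could look like in PyTorch. The function name `reset_and_average`, the hyperparameters `reset_fraction` and `avg_weight`, and the random-mask resetting rule are all hypothetical and not taken from the paper.

```python
import torch
import torch.nn as nn

@torch.no_grad()
def reset_and_average(model: nn.Module,
                      prev_state: dict,
                      reset_fraction: float = 0.1,
                      avg_weight: float = 0.5) -> None:
    """Hypothetical weight-space update combining averaging and resetting.

    `prev_state` holds the parameters saved before training on the current
    task; `reset_fraction` and `avg_weight` are illustrative values, not
    hyperparameters from the paper.
    """
    for name, param in model.named_parameters():
        # Weight averaging: interpolate between the pre-task weights and
        # the weights obtained after training on the current task.
        averaged = avg_weight * prev_state[name] + (1.0 - avg_weight) * param

        # Weight resetting: re-initialize a random subset of entries
        # (uniform random mask assumed here) to free capacity for new tasks.
        mask = torch.rand_like(param) < reset_fraction
        fresh = torch.empty_like(param).normal_(mean=0.0, std=0.02)
        param.copy_(torch.where(mask, fresh, averaged))


# Example usage: snapshot the weights before the new task, train as usual
# (e.g., replaying from the ample exemplar memory), then apply the update.
# prev_state = {n: p.detach().clone() for n, p in model.named_parameters()}
# ... train on the current task ...
# reset_and_average(model, prev_state)
```

Because the update touches only the weights after training, it adds essentially no cost to the training loop itself, which is consistent with the abstract's claim of reduced computational overhead.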
Feb-11-2025