No Forgetting Learning: Memory-free Continual Learning
Mohammad Ali Vahedifar, Qi Zhang
arXiv.org Artificial Intelligence
Continual Learning (CL) remains a central challenge in deep learning: models must sequentially acquire new knowledge while mitigating Catastrophic Forgetting (CF) of prior tasks. Existing approaches often struggle with efficiency and scalability, requiring extensive memory or model buffers. This work introduces "No Forgetting Learning" (NFL), a memory-free CL framework that leverages knowledge distillation to maintain stability while preserving plasticity; memory-free means that NFL does not rely on any memory buffer. Through extensive evaluations on three benchmark datasets, we demonstrate that NFL achieves competitive performance while using approximately 14.75 times less memory than state-of-the-art methods. Furthermore, we introduce a new metric to better assess the plasticity-stability trade-off in CL.
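The abstract does not spell out NFL's loss, but the memory-free distillation idea it describes is commonly realized by penalizing divergence between the current model and a frozen snapshot of the previous-task model, instead of replaying stored samples. A minimal PyTorch sketch of that general pattern (in the spirit of Learning-without-Forgetting-style methods, not NFL's exact objective) is given below; `model`, `old_model`, `T`, and `alpha` are hypothetical names chosen for illustration:

```python
import torch
import torch.nn.functional as F

def distillation_step(model, old_model, x, y, T=2.0, alpha=0.5):
    """One training step combining new-task cross-entropy (plasticity)
    with a distillation penalty anchoring outputs to a frozen copy of
    the model trained on earlier tasks (stability). No replay buffer
    is used; only the old model's weights are kept."""
    logits = model(x)                      # current model's predictions
    with torch.no_grad():
        old_logits = old_model(x)          # frozen pre-update snapshot
    # Cross-entropy on the new task's labels
    ce = F.cross_entropy(logits, y)
    # Temperature-scaled KL divergence to the old model's soft outputs
    kd = F.kl_div(
        F.log_softmax(logits / T, dim=1),
        F.softmax(old_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    return alpha * ce + (1.0 - alpha) * kd
```

The `alpha` weight trades new-task accuracy against retention of prior behavior; the exact balancing used by NFL is described in the paper itself.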
Mar-7-2025