Bilevel Continual Learning
Pham, Quang; Sahoo, Doyen; Liu, Chenghao; Hoi, Steven C. H.
Continual learning aims to learn continuously from a stream of tasks and data in an online fashion, exploiting previously acquired knowledge to improve learning on current and future tasks while still performing well on earlier ones. A common limitation of many existing continual learning methods is that, owing to the online nature of the setting, they train a model directly on all available training data without validation, and thus suffer poor generalization at test time. In this work, we present a novel continual learning framework named "Bilevel Continual Learning" (BCL), which unifies a bilevel optimization objective with a dual memory management strategy comprising both an episodic memory and a generalization memory, achieving effective knowledge transfer to future tasks while simultaneously alleviating catastrophic forgetting of old tasks. Extensive experiments on continual learning benchmarks demonstrate the efficacy of the proposed BCL compared to many state-of-the-art methods.

Unlike humans, conventional machine learning methods, particularly neural networks, struggle to learn continuously because they lose the ability to perform previously acquired skills when learning a new task (French, 1999). Continual learning systems are specifically designed to learn continuously from a stream of tasks, accumulating knowledge over time to improve future learning outcomes while still performing well on previous tasks.
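The abstract does not spell out the bilevel objective itself. As a minimal illustrative sketch of the generic bilevel form it alludes to, one could write the inner problem as fitting model parameters \(\theta\) on the current task's data plus the episodic memory, and the outer problem as tuning meta-parameters \(\phi\) against the held-out generalization memory; all symbols (\(\theta\), \(\phi\), \(\mathcal{D}_t\), \(\mathcal{M}_{\text{ep}}\), \(\mathcal{M}_{\text{gen}}\)) are our assumptions, not the paper's notation:

```latex
% Generic bilevel sketch (illustrative notation, not the paper's exact formulation).
% Inner problem: fit parameters \theta on current task data D_t plus episodic memory M_ep.
% Outer problem: choose meta-parameters \phi so the fitted model generalizes on M_gen.
\begin{align}
  \min_{\phi}\;& \mathcal{L}\bigl(\theta^{*}(\phi);\, \mathcal{M}_{\text{gen}}\bigr) \\
  \text{s.t.}\;& \theta^{*}(\phi) \in \arg\min_{\theta}\,
      \mathcal{L}\bigl(\theta, \phi;\, \mathcal{D}_{t} \cup \mathcal{M}_{\text{ep}}\bigr)
\end{align}
```

Under this reading, the episodic memory serves the inner (training) level to combat forgetting, while the generalization memory acts as a validation signal at the outer level, standing in for the held-out validation set that the online setting otherwise precludes.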
Jul-30-2020