Compacting, Picking and Growing for Unforgetting Continual Learning

Ching-Yi Hung, Cheng-Hao Tu, Cheng-En Wu, Chien-Hung Chen, Yi-Ming Chan, Chu-Song Chen

Neural Information Processing Systems

Continual lifelong learning is essential to many applications. In this paper, we propose a simple but effective approach to continual deep learning. By enforcing the integration of model compacting, critical-weight picking, and network growing in an iterative manner, we introduce an incremental learning method that is scalable to the number of sequential tasks in a continual learning process. Our approach is easy to implement and has several favorable characteristics. First, it can avoid forgetting (i.e., learn new tasks while remembering all previous tasks).
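The abstract only names the compacting, picking, and growing steps; as an illustration under our own assumptions (the function names, the toy "training" step, and the magnitude-based pruning rule below are hypothetical stand-ins, not the paper's implementation), the iterative mask-based loop they describe might be sketched as:

```python
import numpy as np

def train(weights, trainable_mask, rng):
    # Stand-in for gradient-based training: only entries allowed by
    # trainable_mask are updated, so frozen weights never change.
    return weights + rng.normal(size=weights.shape) * trainable_mask

def compact(weights, trainable_mask, keep_ratio=0.5):
    # Compacting: keep only the largest-magnitude newly trained weights,
    # releasing the rest as free capacity for future tasks.
    free = trainable_mask.astype(bool)
    if not free.any():
        return np.zeros_like(trainable_mask)
    thresh = np.quantile(np.abs(weights[free]), 1.0 - keep_ratio)
    return ((np.abs(weights) >= thresh) & free).astype(float)

def continual_learn(n_tasks, n_weights=16, seed=0):
    rng = np.random.default_rng(seed)
    weights = np.zeros(n_weights)
    frozen = np.zeros(n_weights)   # union of weights kept by earlier tasks
    task_masks, snapshots = [], []
    for _ in range(n_tasks):
        free = 1.0 - frozen                  # growing: remaining free capacity
        weights = train(weights, free, rng)  # learn the new task on free weights
        kept = compact(weights, free)        # compacting: prune redundant ones
        weights *= np.maximum(frozen, kept)  # zero out the pruned weights
        mask = np.maximum(frozen, kept)      # picking: task reuses old + new weights
        task_masks.append(mask)
        snapshots.append(weights * mask)     # record this task's effective weights
        frozen = np.maximum(frozen, kept)    # freeze kept weights -> no forgetting
    return weights, task_masks, snapshots, frozen
```

Because every task only trains weights outside the frozen set, each earlier task's effective weights (its mask applied to the final weight vector) are preserved exactly, which is the "unforgetting" property the abstract claims.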