Look-Ahead Selective Plasticity for Continual Learning of Visual Tasks

Rouzbeh Meshkinnejad, Jie Mei, Daniel Lizotte, Yalda Mohsenzadeh

arXiv.org Artificial Intelligence 

Contrastive representation learning has emerged as a promising technique for continual learning, as it can learn representations that are robust to catastrophic forgetting and generalize well to unseen future tasks. Previous work in continual learning has addressed forgetting by relying on previous task data and previously trained models. Inspired by the event models created and updated in the brain, we propose a new mechanism that takes place at task boundaries, i.e., when one task finishes and another starts. Observing the redundancy-inducing ability of contrastive loss on the output of a neural network, our method leverages the first few samples of the new task to identify and retain the parameters that contribute most to the transfer ability of the network, freeing up the remaining parts of the network to learn new features. We evaluate the proposed method on benchmark computer vision datasets, including CIFAR-10 and TinyImageNet, and demonstrate state-of-the-art performance in the task-incremental, class-incremental, and domain-incremental continual learning scenarios.

Deep neural networks (DNNs) solve a variety of computer vision tasks with high performance. While this feat has been achieved through access to large and diverse datasets, in many practical scenarios data is not available in its entirety at first and instead arrives over time, potentially including new unseen classes and shifting target distributions. When presented with a sequence of classification tasks to learn and remember, DNNs suffer from the well-known catastrophic forgetting problem (McCloskey & Cohen, 1989), abruptly losing performance on previously learned tasks. To address this issue, various continual learning algorithms have been proposed.
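The abstract does not spell out the mechanism, but the core idea it describes, using the first few samples of an incoming task to decide which parameters to protect at a task boundary, can be illustrated in outline. The sketch below is a hypothetical PyTorch illustration, not the authors' implementation: the names `lookahead_importance`, `build_protection_masks`, and `masked_step`, the gradient-magnitude scoring, and the `keep_fraction` threshold are all assumed stand-ins, and `loss_fn` stands in for whatever contrastive objective the paper actually uses.

```python
import torch

def lookahead_importance(model, loss_fn, new_task_batch):
    # Hypothetical scoring: absolute gradient of the loss on a few
    # samples from the *incoming* task (the paper's exact criterion
    # is not given in the abstract).
    model.zero_grad()
    x, y = new_task_batch
    loss_fn(model(x), y).backward()
    return {name: p.grad.detach().abs()
            for name, p in model.named_parameters() if p.grad is not None}

def build_protection_masks(importance, keep_fraction=0.1):
    # Protect the top `keep_fraction` of weights in each tensor; the
    # rest of the network stays plastic for the new task.
    masks = {}
    for name, score in importance.items():
        k = max(1, int(keep_fraction * score.numel()))
        threshold = score.flatten().topk(k).values.min()
        masks[name] = score >= threshold  # True = protected
    return masks

def masked_step(model, optimizer, masks):
    # Zero the gradients of protected parameters before the update,
    # so only the freed-up weights learn the new task.
    for name, p in model.named_parameters():
        if p.grad is not None and name in masks:
            p.grad[masks[name]] = 0.0
    optimizer.step()
```

Under these assumptions, one would recompute the masks from the new task's first batch at each task boundary and then train with `masked_step` in place of a plain `optimizer.step()`.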
