Linear Mode Connectivity in Multitask and Continual Learning

Mirzadeh, Seyed Iman, Farajtabar, Mehrdad, Gorur, Dilan, Pascanu, Razvan, Ghasemzadeh, Hassan

arXiv.org Artificial Intelligence 

Continual (sequential) training and multitask (simultaneous) training often attempt to solve the same overall objective: to find a solution that performs well on all considered tasks. The main difference is in the training regime: continual learning has access to only one task at a time, which for neural networks typically leads to catastrophic forgetting, i.e., the solution found for a subsequent task no longer performs well on the previous ones. However, the relationship between the minima that the two training regimes arrive at is not well understood. Is there a local structure that could explain the difference in performance achieved by the two schemes? Motivated by recent work showing that different minima of the same task are typically connected by very simple curves of low error, we investigate whether multitask and continual solutions are similarly connected. We empirically find that such connectivity can indeed be reliably achieved and, more interestingly, that it can be achieved by a linear path, conditioned on both solutions starting from the same initialization. We thoroughly analyze this observation and discuss its significance for the continual learning process. Furthermore, we exploit this finding to propose an effective algorithm that constrains the sequentially learned minima to behave like the multitask solution.

One major consequence of learning multiple tasks in a continual learning (CL) setting, where tasks are learned sequentially and the model has access to only one task at a time, is catastrophic forgetting (McCloskey & Cohen, 1989).
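The central empirical claim, that continual and multitask minima reached from the same initialization are connected by a low-error linear path, can be probed with a simple interpolation test: walk along the straight line between the two sets of weights and evaluate the loss at each point. The following is a minimal PyTorch sketch, not the authors' code; `loader` and `loss_fn` are hypothetical placeholders for a task's data and objective, and `model_a`/`model_b` stand for, e.g., the continually trained and multitask solutions.

```python
# Minimal sketch of a linear mode connectivity check (assumed setup, not the paper's code).
import copy
import torch

@torch.no_grad()
def loss_on_linear_path(model_a, model_b, loader, loss_fn, num_points=11, device="cpu"):
    """Evaluate the loss at evenly spaced points on the segment between two models."""
    sd_a, sd_b = model_a.state_dict(), model_b.state_dict()
    probe = copy.deepcopy(model_a).to(device).eval()
    losses = []
    for alpha in torch.linspace(0.0, 1.0, num_points):
        # Interpolated parameters: (1 - alpha) * w_a + alpha * w_b.
        # Non-floating buffers (e.g., counters) are copied from model_a unchanged.
        interp = {}
        for k in sd_a:
            if sd_a[k].is_floating_point():
                interp[k] = (1 - alpha) * sd_a[k] + alpha * sd_b[k]
            else:
                interp[k] = sd_a[k]
        probe.load_state_dict(interp)

        total, count = 0.0, 0
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            total += loss_fn(probe(x), y).item() * x.size(0)
            count += x.size(0)
        losses.append(total / count)
    # A flat, low loss profile (no barrier between the endpoints) suggests the two
    # minima are linearly connected; a pronounced bump indicates a loss barrier.
    return losses
```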
