Continual Learning with Weight Interpolation
Kozal, Jędrzej, Wasilewski, Jan, Krawczyk, Bartosz, Woźniak, Michał
–arXiv.org Artificial Intelligence
Continual learning poses a fundamental challenge for modern machine learning systems, requiring models to adapt to new tasks while retaining knowledge from previous ones. Addressing this challenge necessitates the development of efficient algorithms capable of learning from data streams and accumulating knowledge over time. This paper proposes a novel approach to continual learning utilizing the weight consolidation method. Our method, a simple yet powerful technique, enhances robustness against catastrophic forgetting by interpolating between old and new model weights after each novel task, effectively merging two models to facilitate exploration of local minima.

This property is known as mode connectivity. One feature that may be considered when studying this phenomenon is the permutation invariance of neural networks [12]. Neurons or kernels of network layers can be permuted and, if neighboring layers' outputs and inputs are adjusted, one can obtain a solution that has the same properties as the original model but lies in a completely different part of the loss landscape. Considering this fact, one may conclude that the abundance of local minima in the loss landscape of neural networks results from permutation invariance. In a follow-up work, Ainsworth et al. [1] showed how to find permutations of weights that allow for a linear interpolation of weights with low or even near zero barriers.
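As an illustration of the permutation invariance described above, the following minimal PyTorch sketch (our own example, not code from the paper) permutes the hidden units of a small MLP and applies the matching permutation to the next layer's input weights; the resulting network computes exactly the same function while its weights sit elsewhere in parameter space.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Two-layer MLP: y = W2 @ relu(W1 @ x + b1) + b2
net = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 3))
x = torch.randn(5, 4)
y_before = net(x)

# Random permutation of the 8 hidden units.
perm = torch.randperm(8)
with torch.no_grad():
    net[0].weight.copy_(net[0].weight[perm])     # permute rows (outputs) of layer 1
    net[0].bias.copy_(net[0].bias[perm])
    net[2].weight.copy_(net[2].weight[:, perm])  # permute columns (inputs) of layer 2

y_after = net(x)
# Same function, different point in weight space.
print(torch.allclose(y_before, y_after, atol=1e-6))  # True
```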
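The interpolation of old and new weights that the abstract describes could, in its simplest form, look like the sketch below. This is a hypothetical illustration: the function name, the single mixing coefficient alpha, and the direct blending of state_dict tensors are assumptions, and the paper's actual method additionally relies on aligning the two models (e.g., via weight permutations) before interpolating.

```python
import copy
import torch

def interpolate_weights(old_model, new_model, alpha=0.5):
    """Return a model whose parameters are a convex combination of the
    old model (trained on previous tasks) and the new model (trained on
    the latest task). alpha=1.0 keeps only the old weights."""
    merged_model = copy.deepcopy(new_model)
    old_state = old_model.state_dict()
    new_state = new_model.state_dict()
    merged_state = {}
    for name, new_param in new_state.items():
        if torch.is_floating_point(new_param):
            merged_state[name] = alpha * old_state[name] + (1.0 - alpha) * new_param
        else:
            merged_state[name] = new_param  # e.g. integer buffers: keep as-is
    merged_model.load_state_dict(merged_state)
    return merged_model
```

In a continual-learning loop, the model consolidated after task t-1 would serve as old_model, and the copy fine-tuned on task t as new_model; the merged result then becomes the starting point for the next task.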
Apr-9-2024
- Country:
- North America > United States (0.28)
- Genre:
- Research Report
- New Finding (0.68)
- Promising Solution (0.54)
- Industry:
- Education > Educational Setting > Online (0.46)