On the Subspace Structure of Gradient-Based Meta-Learning
Gustaf Tegnér, Alfredo Reichlin, Hang Yin, Mårten Björkman, Danica Kragic
arXiv.org Artificial Intelligence
In this work we provide an analysis of the distribution of the post-adaptation parameters of Gradient-Based Meta-Learning (GBML) methods. Previous work has noticed how, for the case of image classification, this adaptation only takes place on the last layers of the network. We propose the more general notion that parameters are updated over a low-dimensional subspace of the same dimensionality as the task-space and show that this holds for regression as well. Furthermore, the induced subspace structure provides a method to estimate the intrinsic dimension of the space of tasks.

Figure 1: The space of task-adapted parameters for a sine regression task embedded in two-dimensional space. The polar-coordinate structure of the task is preserved in the space of task-adapted parameters.
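The kind of analysis the abstract describes can be illustrated with a hedged sketch: if post-adaptation parameters lie near a low-dimensional affine subspace whose dimension matches the task space (two for sine regression, e.g. amplitude and phase), PCA on the stacked adapted-parameter vectors should recover that dimension. The simulation below is not the paper's method, only a synthetic stand-in: the subspace, task coordinates, noise scale, and the 99% variance threshold are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for post-adaptation parameters across many tasks:
# a shared initialization plus a task-dependent update confined to a
# 2-dimensional subspace (mimicking a 2-parameter sine-regression task
# space), plus small noise. All sizes here are illustrative choices.
n_tasks, n_params, task_dim = 500, 100, 2

theta0 = rng.normal(size=n_params)                               # shared init
basis = np.linalg.qr(rng.normal(size=(n_params, task_dim)))[0]   # orthonormal subspace
task_coords = rng.normal(size=(n_tasks, task_dim))               # per-task codes
noise = 1e-3 * rng.normal(size=(n_tasks, n_params))

adapted = theta0 + task_coords @ basis.T + noise                 # (n_tasks, n_params)

# Estimate the intrinsic dimension by PCA: count the principal
# components needed to explain 99% of the variance of the centered
# adapted-parameter matrix.
centered = adapted - adapted.mean(axis=0)
svals = np.linalg.svd(centered, compute_uv=False)
var_ratio = svals**2 / np.sum(svals**2)
est_dim = int(np.searchsorted(np.cumsum(var_ratio), 0.99) + 1)
print(est_dim)  # recovers 2, the dimensionality of the simulated task space
```

With real GBML runs one would replace the simulated `adapted` matrix with the flattened post-adaptation parameter vectors collected over tasks; the PCA step is unchanged.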
Sep-30-2022