Model-Robust and Adaptive-Optimal Transfer Learning for Tackling Concept Shifts in Nonparametric Regression
Lin, Haotian, Reimherr, Matthew
Nonparametric regression is one of the most extensively studied problems of the past decades, owing to its remarkable flexibility in modeling the relationship between an input X and an output Y. While numerous algorithms have been developed, their strong guarantees of learnability and generalization rely on the assumptions that sufficient training samples are available and that future data follow the same distribution as the training data. However, training-sample scarcity in the target domain of interest and distribution shifts occur frequently in practical applications and deteriorate the effectiveness of most existing algorithms, both empirically and theoretically. Transfer learning has emerged as an appealing and promising paradigm for addressing these challenges by leveraging samples or pre-trained models from similar, yet not identical, source domains. In this work, we study transfer learning in the presence of concept shift for nonparametric regression over specific reproducing kernel Hilbert spaces (RKHS). Specifically, we posit that only limited labeled samples are available from the target domain, but sufficient labeled samples are available from a similar source domain in which the concept has shifted: the conditional distribution of Y|X changes across domains, which implies that the underlying regression function shifts.
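The concept-shift setting described above can be illustrated with a small simulation. The sketch below is not the paper's algorithm; it is a hypothetical NumPy example in which the source and target domains share the same marginal distribution of X but have different regression functions (the offset `0.3 * x` is an assumed, illustrative shift). A kernel ridge regressor trained only on abundant source data is compared against a naive transfer baseline that uses the few target samples to estimate the shift function f_target − f_source:

```python
import numpy as np

rng = np.random.default_rng(0)

def f_source(x):
    # Source-domain regression function E[Y|X] (assumed for illustration).
    return np.sin(2 * np.pi * x)

def f_target(x):
    # Target-domain regression function: same shape plus a small,
    # smooth concept shift (hypothetical choice).
    return np.sin(2 * np.pi * x) + 0.3 * x

# Same marginal P(X) in both domains; only the conditional Y|X changes.
n_src, n_tgt = 200, 20          # abundant source data, scarce target data
X_src = rng.uniform(0, 1, n_src)
X_tgt = rng.uniform(0, 1, n_tgt)
y_src = f_source(X_src) + 0.1 * rng.standard_normal(n_src)
y_tgt = f_target(X_tgt) + 0.1 * rng.standard_normal(n_tgt)

def rbf(a, b, gamma=20.0):
    # Gaussian (RBF) kernel matrix between 1-D sample vectors a and b.
    return np.exp(-gamma * (a[:, None] - b[None, :]) ** 2)

def krr_fit_predict(X_tr, y_tr, X_te, lam=1e-3):
    # Kernel ridge regression: solve (K + lam*n*I) alpha = y, then predict.
    K = rbf(X_tr, X_tr)
    alpha = np.linalg.solve(K + lam * len(X_tr) * np.eye(len(X_tr)), y_tr)
    return rbf(X_te, X_tr) @ alpha

X_grid = np.linspace(0, 1, 500)

# Baseline: ignore the shift and reuse the source-trained model directly.
pred_src_only = krr_fit_predict(X_src, y_src, X_grid)

# Naive transfer: fit the shift function on the target residuals
# relative to the source fit, using only the scarce target samples.
resid = y_tgt - krr_fit_predict(X_src, y_src, X_tgt)
offset_hat = krr_fit_predict(X_tgt, resid, X_grid, lam=1e-2)
pred_transfer = pred_src_only + offset_hat

mse_src_only = np.mean((pred_src_only - f_target(X_grid)) ** 2)
mse_transfer = np.mean((pred_transfer - f_target(X_grid)) ** 2)
print(f"source-only MSE: {mse_src_only:.4f}, transfer MSE: {mse_transfer:.4f}")
```

With this setup, the source-only error is dominated by the unmodeled shift (roughly the average of `(0.3x)^2`), while correcting with the estimated offset typically reduces it; the paper's contribution is the principled, rate-optimal version of such a correction under concept shift.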
Jan-18-2025