MOKD: Cross-domain Finetuning for Few-shot Classification via Maximizing Optimized Kernel Dependence
Hongduan Tian, Feng Liu, Tongliang Liu, Bo Du, Yiu-ming Cheung, Bo Han
–arXiv.org Artificial Intelligence
In cross-domain few-shot classification, nearest centroid classifier (NCC) aims to learn representations to construct a metric space where few-shot classification can be performed by measuring the similarities between samples and the prototype of each class. An intuition behind NCC is that each sample is pulled closer to the class centroid it belongs to while pushed away from those of other classes. However, in this paper, we find that there exist high similarities between NCC-learned representations of two samples from different classes. In order to address this problem, we propose a bi-level optimization framework, maximizing optimized kernel dependence (MOKD) […]

Cross-domain few-shot classification (Dvornik et al., 2020; Li et al., 2021a; Liu et al., 2021a; Triantafillou et al., 2020), also known as CFC, is a learning paradigm which aims at learning to perform classification on tasks sampled from previously unseen data or domains with only a few labeled data available. Compared with conventional few-shot classification (Finn et al., 2017; Ravi & Larochelle, 2017; Snell et al., 2017; Vinyals et al., 2016), which learns to adapt to new tasks sampled from unseen data with the same distribution as seen data, cross-domain few-shot classification is a much more challenging learning task since there exist discrepancies between the distributions of source and target domains (Chi et al., 2021; Kuzborskij & Orabona, 2013).
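The nearest centroid classifier described above can be illustrated with a minimal NumPy sketch: class prototypes are the means of the labeled support embeddings, and each query is assigned to the class whose prototype it is most similar to. This is a generic NCC sketch, not the paper's implementation; the function and variable names are illustrative, and cosine similarity is one common choice of metric (an assumption here, not specified by the abstract).

```python
import numpy as np

def nearest_centroid_predict(support, support_labels, query):
    """Assign each query embedding to the class with the nearest prototype.

    support: (n_support, d) embeddings of the labeled few-shot samples
    support_labels: (n_support,) integer class labels
    query: (n_query, d) embeddings to classify
    """
    classes = np.unique(support_labels)
    # Prototype of each class = mean of its support embeddings.
    prototypes = np.stack(
        [support[support_labels == c].mean(axis=0) for c in classes]
    )
    # Cosine similarity between queries and prototypes (one possible metric).
    q = query / np.linalg.norm(query, axis=1, keepdims=True)
    p = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    sims = q @ p.T  # shape: (n_query, n_classes)
    # Each query is "pulled" toward the most similar prototype.
    return classes[sims.argmax(axis=1)]
```

Training NCC-style representations then amounts to learning the embedding function so that this prototype-based decision rule is accurate on held-out query samples of each task.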
May-29-2024