Few-Shot Learning with Class Imbalance
Mateusz Ochal, Massimiliano Patacchiola, Amos Storkey, Jose Vazquez, Sen Wang
arXiv.org Artificial Intelligence
Abstract--Few-Shot Learning (FSL) algorithms are commonly trained through Meta-Learning (ML), which exposes models to batches of tasks sampled from a meta-dataset to mimic tasks seen during evaluation. However, standard training procedures overlook the real-world dynamics where classes commonly occur at different frequencies. While it is generally understood that class imbalance harms the performance of supervised methods, limited research examines the impact of imbalance on the FSL evaluation task. Our analysis compares 10 state-of-the-art meta-learning and FSL methods on different imbalance distributions and rebalancing techniques. Our results reveal that: 1) some FSL methods display a natural disposition against imbalance, while most other approaches suffer a performance drop of up to 17% compared to the balanced task without the appropriate mitigation; 2) contrary to popular belief, many meta-learning algorithms will not automatically learn to balance from exposure to imbalanced training tasks; 3) classical rebalancing strategies, such as random oversampling, can still be very effective, leading to state-of-the-art performance, and should not be overlooked; 4) FSL methods are more robust against meta-dataset imbalance than against task-level imbalance with a similar imbalance ratio (ρ < 20), with the effect holding even in long-tail datasets under a larger imbalance (ρ = 65). We identify and examine three levels of class imbalance: task-level, dataset-level, and combined (task-level and dataset-level) imbalance. In contrast to previous work on CIFSL [12], [13], [14], [15], we explicitly attribute and quantify the impact on performance caused by class imbalance for each model. Moreover, we study multiple class imbalance distributions, giving a realistic assessment of performance and revealing previously unknown strengths and weaknesses of 10 state-of-the-art methods. Additionally, we offer practical advice. Figure 1 shows a graphical representation of the CIFSL problem.

[Introduction excerpt] [...] well to new examples. However, large datasets can be costly to obtain and annotate [1]. This is a particularly limiting issue in many real-world situations due to the need to perform real-time operations, the presence of rare categories, or the desire for a good user experience [2], [3], [4], [5]. Few-Shot Learning (FSL) alleviates this burden by defining a distribution over tasks, with each task containing a few labeled data points (support set) and a set of target data points (query set) belonging to the same set of classes. A common way to train FSL methods is through Meta-Learning (ML): the model is repeatedly exposed to batches of tasks sampled [...]
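The abstract refers to the task-level imbalance ratio ρ, to support sets of few-shot tasks, and to random oversampling as a classical rebalancing strategy. The following is a minimal sketch of those ideas; the function names, the shot counts, and the toy data are illustrative assumptions, not taken from the paper.

```python
import random
from collections import Counter

def sample_imbalanced_support(pool_per_class, shots_per_class, seed=0):
    """Sample a toy imbalanced few-shot support set.

    `shots_per_class` maps each class to its number of support shots,
    e.g. {0: 1, 1: 5, 2: 15}; the task-level imbalance ratio rho is
    max(shots) / min(shots) = 15 in that case.
    """
    rng = random.Random(seed)
    support = []
    for cls, k in shots_per_class.items():
        support.extend((x, cls) for x in rng.sample(pool_per_class[cls], k))
    return support

def imbalance_ratio(support):
    """rho = majority-class count / minority-class count."""
    counts = Counter(cls for _, cls in support)
    return max(counts.values()) / min(counts.values())

def random_oversample(support, seed=0):
    """Classical rebalancing: duplicate minority-class examples at random
    until every class matches the majority-class count (rho becomes 1)."""
    rng = random.Random(seed)
    by_class = {}
    for x, cls in support:
        by_class.setdefault(cls, []).append((x, cls))
    target = max(len(items) for items in by_class.values())
    balanced = []
    for items in by_class.values():
        balanced.extend(items)
        balanced.extend(rng.choices(items, k=target - len(items)))
    return balanced
```

Applied before fitting a per-task classifier, oversampling leaves the data distribution biased toward duplicated minority examples but equalizes class counts, which is the behavior the paper's point 3) evaluates.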
Jun-14-2021
- Genre:
- Research Report > New Finding (1.00)
- Industry:
- Education > Educational Setting (0.46)