If your data distribution shifts, use self-learning
Evgenia Rusak, Steffen Schneider, George Pachitariu, Luisa Eck, Peter Gehler, Oliver Bringmann, Wieland Brendel, Matthias Bethge
arXiv.org Artificial Intelligence
We demonstrate that self-learning techniques like entropy minimization and pseudo-labeling are simple and effective at improving the performance of a deployed computer vision model under systematic domain shifts. We conduct a wide range of large-scale experiments and show consistent improvements irrespective of the model architecture, the pre-training technique, or the type of distribution shift. At the same time, self-learning is simple to use in practice because it does not require knowledge of or access to the original training data or training scheme, is robust to hyperparameter choices, is straightforward to implement, and requires only a few adaptation epochs. This makes self-learning techniques highly attractive for any practitioner who applies machine learning algorithms in the real world.
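To make the entropy-minimization idea from the abstract concrete, here is a minimal sketch in PyTorch: the model is adapted on unlabeled target-domain data purely by making its own softmax predictions more confident. The tiny linear classifier, synthetic batch, optimizer, and step count are illustrative assumptions, not the paper's actual experimental setup.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Toy stand-in for a deployed classifier (hypothetical; the paper uses
# large vision models such as ResNets).
model = nn.Linear(16, 4)

# Unlabeled batch from the shifted target distribution (synthetic here).
x = torch.randn(32, 16)

opt = torch.optim.SGD(model.parameters(), lr=0.1)

def prediction_entropy(logits):
    # Mean Shannon entropy of the softmax predictions over the batch.
    p = F.softmax(logits, dim=1)
    return -(p * p.log()).sum(dim=1).mean()

before = prediction_entropy(model(x)).item()

# A few adaptation steps: no labels are used, only the model's
# own predictions on the target data.
for _ in range(20):
    opt.zero_grad()
    loss = prediction_entropy(model(x))
    loss.backward()
    opt.step()

after = prediction_entropy(model(x)).item()
```

After a handful of steps the prediction entropy on the target batch drops, i.e. the model has become more confident on the shifted data; pseudo-labeling works analogously, but converts the most confident predictions into hard labels and trains on those.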
Dec-7-2023