Large-scale Pre-trained Models are Surprisingly Strong in Incremental Novel Class Discovery

Liu, Mingxuan, Roy, Subhankar, Zhong, Zhun, Sebe, Nicu, Ricci, Elisa

arXiv.org Artificial Intelligence 

Discovering novel concepts from unlabelled data and in a continuous manner is an important desideratum of lifelong learners. In the literature such problems have been partially addressed under very restricted settings, where either access to labelled data is provided for discovering novel concepts (e.g., NCD) or learning occurs for a limited number of incremental steps (e.g., class-iNCD). In this work we challenge the status quo and propose a more challenging and practical learning paradigm called MSc-iNCD, where learning occurs continuously and unsupervisedly, while exploiting the rich priors from large-scale pre-trained models.

In this work we study the problem of Novel Class Discovery (NCD) [19], where the goal is to train neural networks to discover (or group) novel visual concepts present in an unlabelled dataset into semantically meaningful clusters, while leveraging prior knowledge learned from supervised pre-training on a labelled dataset containing disjoint classes (see Fig 1b). Note that NCD is different from fully unsupervised clustering, as there can be several criteria to cluster a dataset unsupervisedly (see Figure 1a). Ever since the pioneering work by Han et al. [19], the field of NCD has
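The core idea of exploiting a frozen pre-trained model for discovery can be illustrated with a minimal sketch: extract fixed feature vectors for the unlabelled images and group them with a simple clustering step. This is only an illustration of the setting, not the paper's MSc-iNCD method; the synthetic features below stand in for embeddings from a large-scale pre-trained backbone, and the k-means routine (with k-means++-style seeding) is an assumed baseline clusterer.

```python
import numpy as np

def kmeans(feats, k, iters=50, seed=0):
    """Cluster frozen feature vectors with k-means (k-means++-style init)."""
    rng = np.random.default_rng(seed)
    # k-means++-style seeding: pick centers far from those already chosen
    centers = [feats[rng.integers(len(feats))]]
    for _ in range(k - 1):
        d2 = np.min([np.linalg.norm(feats - c, axis=1) ** 2 for c in centers], axis=0)
        centers.append(feats[rng.choice(len(feats), p=d2 / d2.sum())])
    centers = np.array(centers)
    for _ in range(iters):
        # assign each feature to its nearest center
        dists = np.linalg.norm(feats[:, None] - centers[None], axis=2)
        assign = dists.argmin(axis=1)
        # recompute centers; keep the old center if a cluster empties
        for j in range(k):
            pts = feats[assign == j]
            if len(pts):
                centers[j] = pts.mean(axis=0)
    return assign

# Synthetic stand-in for frozen embeddings of three novel classes
rng = np.random.default_rng(1)
feats = np.vstack([rng.normal(loc=c, scale=0.1, size=(30, 8))
                   for c in (0.0, 1.0, 2.0)])
labels = kmeans(feats, k=3)
```

Because the synthetic clusters are well separated, the returned `labels` recover the three groups up to a permutation of cluster ids; on real data the quality of such a grouping depends heavily on how discriminative the pre-trained features are.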
