On the Trade-off of Intra-/Inter-class Diversity for Supervised Pre-training
Neural Information Processing Systems
Pre-training datasets are critical for building state-of-the-art machine learning models, motivating rigorous study of their impact on downstream tasks. In this work, we study the trade-off between the intra-class diversity (the number of samples per class) and the inter-class diversity (the number of classes) of a supervised pre-training dataset. Empirically, given a fixed pre-training dataset size, we find that the best downstream performance comes from balancing intra- and inter-class diversity. To understand the underlying mechanism, we show theoretically that downstream performance depends monotonically on both types of diversity.
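To make the trade-off concrete, the sketch below (not from the paper; the function name and toy dataset are illustrative assumptions) builds fixed-size pre-training subsets that allocate a sample budget either toward more classes (inter-class diversity) or toward more samples per class (intra-class diversity):

```python
import random

def subsample(dataset, num_classes, samples_per_class, seed=0):
    """Build a fixed-size pre-training subset by trading off
    inter-class diversity (num_classes) against intra-class
    diversity (samples_per_class). Total size is their product."""
    rng = random.Random(seed)
    by_class = {}
    for x, y in dataset:
        by_class.setdefault(y, []).append(x)
    chosen = rng.sample(sorted(by_class), num_classes)
    subset = []
    for y in chosen:
        subset.extend((x, y) for x in rng.sample(by_class[y], samples_per_class))
    return subset

# Toy labeled dataset: 100 classes, 50 samples each.
data = [(f"img_{c}_{i}", c) for c in range(100) for i in range(50)]

# Same budget of 1000 samples, allocated two different ways:
wide = subsample(data, num_classes=100, samples_per_class=10)  # favors inter-class diversity
deep = subsample(data, num_classes=20, samples_per_class=50)   # favors intra-class diversity
assert len(wide) == len(deep) == 1000
```

Under the paper's finding, neither extreme allocation would be optimal; the best downstream performance would come from an intermediate split of the budget.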