CHIPS: Efficient CLIP Adaptation via Curvature-aware Hybrid Influence-based Data Selection

Xinlin Zhuang, Yichen Li, Xiwei Liu, Haolin Yang, Yifan Lu, Ziyun Zou, Yulong Li, Huifa Li, Dongliang Chen, Qinglei Wang, Weiyang Liu, Ying Qian, Jiangming Shi, Imran Razzak

arXiv.org Artificial Intelligence 

Adapting CLIP to vertical domains is typically approached by novel fine-tuning strategies or by continual pre-training (CPT) on large domain-specific datasets. Yet, data itself remains an underexplored factor in this process. We revisit this task from a data-centric perspective: Can effective data selection substitute for large-scale datasets in CPT? We introduce CHIPS (Curvature-aware Hybrid Influence in Projection Subspace), which assigns each image-text pair a utility score that integrates three complementary factors aligned with three goals: faithfulness via a curvature-aware, Newton-style alignment computed in CLIP's endpoint subspace; scalability via an InfoNCE-aware curvature estimator with Johnson-Lindenstrauss (JL) sketching; and retention via a selection-aware relevance weight combined with learnability to balance target adaptation against general-domain preservation. We justify this design theoretically by proving a lower-bound guarantee on the proxy's correlation with full-parameter alignment and by characterizing the bias-variance trade-offs introduced by curvature mixing and JL sketching. We evaluate CHIPS empirically across various settings: 1) CHIPS attains state-of-the-art performance among selection baselines on 17 medical benchmarks, matches full-dataset CPT with 30% of the data, and outperforms half-dataset CPT using only 10%; 2) on 31 general-domain benchmarks, CHIPS yields the smallest performance drop under 10-30% data-retention budgets. Code, data, and checkpoints will be released.
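The JL sketching component mentioned in the abstract refers to a standard dimensionality-reduction trick: a random Gaussian projection approximately preserves pairwise distances and inner products, which makes per-sample influence computations tractable at scale. The following minimal sketch illustrates the generic technique only (the matrix sizes and feature source are hypothetical; this is not the authors' implementation):

```python
import numpy as np

def jl_sketch(X, k, seed=0):
    """Project rows of X (n x d) down to k dimensions with a
    scaled Gaussian Johnson-Lindenstrauss sketch."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Entries ~ N(0, 1/k) so that squared norms are preserved in expectation
    P = rng.standard_normal((d, k)) / np.sqrt(k)
    return X @ P

# Hypothetical high-dimensional per-sample features (e.g., gradient proxies)
X = np.random.default_rng(1).standard_normal((100, 4096))
Xs = jl_sketch(X, k=256)

# Pairwise distances survive the projection up to small distortion
d_full = np.linalg.norm(X[0] - X[1])
d_sketch = np.linalg.norm(Xs[0] - Xs[1])
```

The distortion shrinks on the order of 1/sqrt(k), so the sketch dimension k trades accuracy against the cost of downstream curvature and influence estimates.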