AdaCap: An Adaptive Contrastive Approach for Small-Data Neural Networks
Belucci, Bruno, Lounici, Karim, Meziani, Katia
arXiv.org Artificial Intelligence
Neural networks struggle on small tabular datasets, where tree-based models remain dominant. We introduce Adaptive Contrastive Approach (AdaCap), a training scheme that combines a permutation-based contrastive loss with a Tikhonov-based closed-form output mapping. Across 85 real-world regression datasets and multiple architectures, AdaCap yields consistent and statistically significant improvements in the small-sample regime, particularly for residual models. A meta-predictor trained on dataset characteristics (size, skewness, noise) accurately anticipates when AdaCap is beneficial. These results show that AdaCap acts as a targeted regularization mechanism, strengthening neural networks precisely where they are most fragile. All results and code are publicly available at https://github.com/BrunoBelucci/adacap.
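The two ingredients named in the abstract can be illustrated together. The sketch below is an assumption about the construction, not the authors' code (see the linked repository for the actual implementation): a Tikhonov (ridge) closed-form mapping fits the last hidden layer `H` to the targets, and a permutation-based contrastive term rewards representations that fit the true targets well while fitting a randomly permuted copy of them poorly.

```python
import numpy as np

def tikhonov_output_mapping(H, y, lam=1.0):
    """Closed-form Tikhonov (ridge) mapping from hidden activations H to targets y.

    Returns the fitted outputs H (H^T H + lam I)^{-1} H^T y, i.e. the
    predictions of a ridge regression computed on the hidden layer.
    """
    n, d = H.shape
    # Solve the normal equations rather than forming an explicit inverse.
    beta = np.linalg.solve(H.T @ H + lam * np.eye(d), H.T @ y)
    return H @ beta

def permutation_contrastive_loss(H, y, lam=1.0, seed=None):
    """Hypothetical permutation-based contrastive criterion: the residual on
    the true targets should be small, the residual on permuted targets large.
    """
    rng = np.random.default_rng(seed)
    y_perm = rng.permutation(y)
    fit_true = np.mean((y - tikhonov_output_mapping(H, y, lam)) ** 2)
    fit_perm = np.mean((y_perm - tikhonov_output_mapping(H, y_perm, lam)) ** 2)
    # Negative values mean the representation separates signal from noise.
    return fit_true - fit_perm

# Toy usage: hidden features that carry a linear signal about y.
rng = np.random.default_rng(0)
H = rng.normal(size=(64, 16))
y = H @ rng.normal(size=16) + 0.1 * rng.normal(size=64)
loss = permutation_contrastive_loss(H, y, lam=1.0, seed=0)
```

Because the permutation destroys the feature-target relationship while keeping the marginal distribution of the targets, the permuted fit acts as a data-driven baseline; the closed-form ridge solution keeps the whole criterion differentiable in `H`, which is what allows it to regularize the network's hidden representation during training.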
Nov-26-2025