LibAUC: A Deep Learning Library for X-Risk Optimization

Zhuoning Yuan, Dixian Zhu, Zi-Hao Qiu, Gang Li, Xuanhui Wang, Tianbao Yang

arXiv.org Artificial Intelligence 

Deep learning platforms such as PyTorch [36] have dramatically reduced the efforts of developers and researchers for implementing different DL methods without worrying about low-level computations (e.g., automatic differentiation, tensor operations, etc.). Based on these platforms, plenty of DL libraries have been developed for different purposes, which can be organized into different categories, including (i) supporting specific tasks [15, 35], e.g., TF-Ranking for LTR [35] and VISSL for self-supervised learning (SSL) [15]; (ii) supporting specific data, e.g., DGL and DIG for graphs [31, 55]; and (iii) supporting specific models [13, 58, 59], e.g., Transformers for transformer models [59]. However, it has been observed that these existing platforms and libraries encounter some unique challenges when solving some classical and emerging problems in AI, including classification for imbalanced data (CID), learning to rank (LTR), and contrastive learning of representations (CLR). In particular, prior works have observed that large mini-batch sizes are necessary to attain good performance for these problems [4, 5, 7, 37, 43, 46], which restricts the capabilities of these AI models in the real world.

The motivation of developing LibAUC is to address the convergence issues of existing libraries for solving these problems. In particular, existing libraries may not converge, or may require very large mini-batch sizes to attain good performance, due to their use of the standard mini-batch technique in the empirical risk minimization (ERM) framework. Our library is for deep X-risk optimization (DXO), which has achieved great success in solving a variety of tasks for CID, LTR, and CLR. The contributions of this paper include: (1) it introduces a new mini-batch-based pipeline for implementing DXO algorithms, which differs from the existing DL pipeline in its design of controlled data samplers and dynamic mini-batch losses; (2) it provides extensive benchmarking experiments for ablation studies and comparison with existing libraries. The LibAUC library features scalable performance for millions of items to be contrasted, faster and better convergence than existing libraries for optimizing X-risks, seamless PyTorch deployment, and versatile APIs for various loss optimization. Our library is available to the open source community.
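The "controlled data sampler" of contribution (1) can be illustrated with a minimal sketch. This is not LibAUC's actual API; the class name `ControlledBatchSampler` and its parameters are hypothetical. The idea it demonstrates is that, under class imbalance, a sampler can guarantee every mini-batch contains a fixed number of positive examples, so that pairwise or compositional losses always have positive-negative pairs to estimate from:

```python
import random


class ControlledBatchSampler:
    """Illustrative sketch of a controlled batch sampler (hypothetical API).

    Unlike uniform mini-batch sampling, which may yield batches with no
    positives under heavy imbalance, this sampler fixes the number of
    positive examples per batch and fills the rest with negatives.
    """

    def __init__(self, labels, batch_size, pos_per_batch, seed=0):
        self.pos_idx = [i for i, y in enumerate(labels) if y == 1]
        self.neg_idx = [i for i, y in enumerate(labels) if y == 0]
        self.batch_size = batch_size
        self.pos_per_batch = pos_per_batch
        self.rng = random.Random(seed)

    def __iter__(self):
        neg_per_batch = self.batch_size - self.pos_per_batch
        # One pass over the negatives defines an "epoch" of batches.
        n_batches = len(self.neg_idx) // neg_per_batch
        for _ in range(n_batches):
            batch = (self.rng.sample(self.pos_idx, self.pos_per_batch)
                     + self.rng.sample(self.neg_idx, neg_per_batch))
            self.rng.shuffle(batch)
            yield batch


if __name__ == "__main__":
    # 5 positives vs. 95 negatives: uniform sampling would often produce
    # all-negative batches; the controlled sampler never does.
    labels = [1] * 5 + [0] * 95
    sampler = ControlledBatchSampler(labels, batch_size=10, pos_per_batch=2)
    for batch in sampler:
        assert sum(labels[i] for i in batch) == 2
```

In a PyTorch training loop, such an index sampler would typically be passed as the `batch_sampler` argument of a `DataLoader`, leaving the model and optimizer code unchanged.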
