Out-of-Core GPU Gradient Boosting
GPU-based algorithms have greatly accelerated many machine learning methods; however, GPU memory is typically smaller than main memory, limiting the size of the training data. In this paper, we describe an out-of-core GPU gradient boosting algorithm implemented in the XGBoost library. We show that much larger datasets can fit on a given GPU without degrading model accuracy or training time. To the best of our knowledge, this is the first out-of-core GPU implementation of gradient boosting. Similar approaches can be applied to other machine learning algorithms.
May 18, 2020
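To make the setting concrete, below is a minimal sketch of how external-memory (out-of-core) GPU training is typically invoked through XGBoost's Python API of that era (circa version 1.1). The data path, cache prefix, objective, and hyperparameter values are illustrative placeholders, not the paper's configuration.

```python
# Minimal sketch: out-of-core GPU training with the XGBoost Python API.
# File names and hyperparameters below are placeholders for illustration.
import xgboost as xgb

# Appending "#<cacheprefix>" to the data path asks XGBoost to stream the
# dataset through an on-disk cache instead of loading it fully into memory.
dtrain = xgb.DMatrix("train.libsvm#dtrain.cache")

params = {
    "tree_method": "gpu_hist",       # GPU histogram-based tree construction
    "objective": "binary:logistic",  # placeholder objective
    "max_depth": 8,                  # illustrative value
}

# Boosting rounds are illustrative only.
bst = xgb.train(params, dtrain, num_boost_round=500)
bst.save_model("model.json")
```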