Paper Title

Out-of-Core GPU Gradient Boosting

Authors

Ou, Rong

Abstract

GPU-based algorithms have greatly accelerated many machine learning methods; however, GPU memory is typically smaller than main memory, limiting the size of training data. In this paper, we describe an out-of-core GPU gradient boosting algorithm implemented in the XGBoost library. We show that much larger datasets can fit on a given GPU, without degrading model accuracy or training time. To the best of our knowledge, this is the first out-of-core GPU implementation of gradient boosting. Similar approaches can be applied to other machine learning algorithms.
