Reviews: Regularized Gradient Boosting

Neural Information Processing Systems 

This paper establishes Rademacher generalization bounds for Regularized Gradient Boosting, a framework that encompasses various accelerated GB methods. Although some work remains to make the algorithm derived from this theoretical study faster, the theoretical contribution itself deserves publication.