Gradient Boosting Machine (GBM), introduced by Friedman (2001), is a widely used ensembling technique and is routinely employed in competitions such as Kaggle and the KDD Cup (Chen and Guestrin, 2016). In this work, we propose an Accelerated Gradient Boosting Machine (AGBM) by incorporating Nesterov's acceleration techniques into the design of GBM. The difficulty in accelerating GBM lies in the fact that weak (inexact) learners are commonly used, so a naive application of acceleration lets their errors accumulate in the momentum term. To overcome this, we design a "corrected pseudo residual" that serves as a new target for fitting a weak learner in order to perform the momentum (z-sequence) update. We are thus able to derive novel computational guarantees for AGBM. This is the first GBM-type algorithm with a theoretically justified accelerated convergence rate.
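The update structure sketched in the abstract can be illustrated in code. The sketch below is a hedged reconstruction, not the paper's exact algorithm: it assumes squared loss, depth-1 regression trees as weak learners, a Nesterov interpolation weight of `2/(m+2)`, and illustrative names (`agbm_fit_predict`, `gamma`) that do not appear in the source. Note the two weak learners per round: one fits the ordinary pseudo residual (the f-update), and one fits the corrected pseudo residual that re-injects what the previous momentum-side learner failed to capture (the z-update).

```python
# Illustrative AGBM-style sketch (squared loss, stump weak learners).
# Constants and the exact form of the corrected residual are assumptions,
# hedged against the abstract's description, not the paper's pseudocode.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def agbm_fit_predict(X, y, n_rounds=20, eta=0.1, gamma=0.1):
    """Return training-set predictions after n_rounds of accelerated boosting."""
    n = len(y)
    f = np.zeros(n)          # main prediction sequence
    h = np.zeros(n)          # momentum (z-sequence) predictions
    c_prev = np.zeros(n)     # previous corrected pseudo residual
    tau2_prev = np.zeros(n)  # previous momentum-side learner's fit
    for m in range(n_rounds):
        theta = 2.0 / (m + 2)
        g = (1 - theta) * f + theta * h          # Nesterov interpolation
        r = y - g                                # ordinary pseudo residual
        # f-update: fit a weak learner to the pseudo residual.
        t1 = DecisionTreeRegressor(max_depth=1).fit(X, r)
        f = g + eta * t1.predict(X)
        # Corrected pseudo residual: add back the part of the previous
        # target that the previous weak learner failed to fit.
        c = r if m == 0 else r + ((m + 1) / (m + 2)) * (c_prev - tau2_prev)
        # z-update: fit a second weak learner to the corrected residual.
        t2 = DecisionTreeRegressor(max_depth=1).fit(X, c)
        tau2_prev = t2.predict(X)
        h = h + (gamma * eta / theta) * tau2_prev
        c_prev = c
    return f
```

The key design point, per the abstract, is that the momentum sequence is never fit to the raw residual: fitting the corrected target is what keeps the weak learners' approximation errors from compounding in the momentum term.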