The Evolution of Boosting Algorithms
Decision trees are a supervised learning method used in statistics, data mining, and machine learning, and they can be applied to both classification and regression. Their accuracy can be improved through boosting, first described by Schapire in his paper "The Strength of Weak Learnability" [1]. A boosting algorithm is a learning algorithm that combines weak learners in order to generate high-accuracy hypotheses. Over the years the idea has been refined and adapted by many contributors. The fact that it has spawned a series of variants, such as AdaBoost, gradient boosting, XGBoost, and LightGBM, is proof that the core idea has passed "the test of time".
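To make the "weak learners into a strong learner" idea concrete, here is a minimal from-scratch sketch of AdaBoost on a hypothetical one-dimensional toy dataset (the data, the decision-stump weak learner, and the number of rounds are all illustrative assumptions, not from the original paper):

```python
import math

# Hypothetical toy dataset: 1-D points with labels in {-1, +1},
# chosen so that no single threshold separates the classes.
X = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0]
y = [+1, +1, +1, -1, -1, -1, -1, +1, +1, +1]

def stump_predict(x, threshold, polarity):
    """A decision stump (one-split tree): the classic 'weak learner'."""
    return polarity if x < threshold else -polarity

def best_stump(X, y, w):
    """Pick the stump with the lowest weighted training error."""
    best = None
    for threshold in X:
        for polarity in (+1, -1):
            err = sum(wi for xi, yi, wi in zip(X, y, w)
                      if stump_predict(xi, threshold, polarity) != yi)
            if best is None or err < best[0]:
                best = (err, threshold, polarity)
    return best

def adaboost(X, y, rounds=10):
    n = len(X)
    w = [1.0 / n] * n          # start with uniform example weights
    ensemble = []              # list of (alpha, threshold, polarity)
    for _ in range(rounds):
        err, threshold, polarity = best_stump(X, y, w)
        err = max(err, 1e-10)  # guard against division by zero
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, threshold, polarity))
        # Re-weight: increase the weight of misclassified examples
        # so the next weak learner focuses on them.
        w = [wi * math.exp(-alpha * yi *
                           stump_predict(xi, threshold, polarity))
             for xi, yi, wi in zip(X, y, w)]
        total = sum(w)
        w = [wi / total for wi in w]
    return ensemble

def predict(ensemble, x):
    """Final hypothesis: sign of the alpha-weighted vote of all stumps."""
    score = sum(alpha * stump_predict(x, threshold, polarity)
                for alpha, threshold, polarity in ensemble)
    return +1 if score >= 0 else -1

ensemble = adaboost(X, y, rounds=10)
accuracy = sum(predict(ensemble, xi) == yi
               for xi, yi in zip(X, y)) / len(X)
```

On this dataset the best single stump gets at most 70% training accuracy, while the boosted ensemble recovers the interval structure of the labels; the same re-weighting principle underlies AdaBoost, and gradient-based variants generalize it to arbitrary differentiable losses.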