A study of tree-based methods and their combination
With the growth of data volume and the rapid development of deep learning, many traditional machine learning techniques have been outperformed by artificial neural networks; nevertheless, tree-based methods remain popular. Random forest (Breiman, 2001) is commonly used as a benchmark for evaluating nonparametric models, while XGBoost (Chen and Guestrin, 2016) performs well in Kaggle competitions and often competes with artificial neural networks. Moreover, rather than relying on a single method, practitioners often make decisions based on a combination of multiple models, which typically performs better than any individual one. Identifying the importance of each model through weight assignment is therefore critical.
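The model combination described above can be sketched as a weighted average of each model's predicted probabilities. This is a minimal illustration, not the report's actual method; the model names and weights below are hypothetical placeholders.

```python
def weighted_ensemble(predictions, weights):
    """Combine per-model probability predictions using normalized weights.

    predictions: list of per-model probability lists, one entry per sample.
    weights: one nonnegative importance weight per model.
    """
    total = sum(weights)
    norm = [w / total for w in weights]  # normalize so weights sum to 1
    n_samples = len(predictions[0])
    combined = [0.0] * n_samples
    for preds, w in zip(predictions, norm):
        for i, p in enumerate(preds):
            combined[i] += w * p
    return combined

# Hypothetical per-sample positive-class probabilities from three models
rf_probs  = [0.9, 0.2, 0.6]   # e.g. random forest
xgb_probs = [0.8, 0.3, 0.7]   # e.g. XGBoost
nn_probs  = [0.7, 0.4, 0.5]   # e.g. neural network
combined = weighted_ensemble([rf_probs, xgb_probs, nn_probs],
                             weights=[0.5, 0.3, 0.2])
```

In practice the weights would be chosen by validation performance rather than fixed by hand, which is precisely the weight-assignment problem the abstract raises.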
Apr-29-2022