Hands-on Random Forest with Python

#artificialintelligence

A single model may make wrong predictions, but combining the predictions of several models often produces better ones. This concept is called ensemble learning: ensembles are methods that combine multiple models to build a more powerful model. Ensemble methods have gained huge popularity during the last decade.
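The idea can be sketched with a few lines of scikit-learn (assumed installed): several different models are trained and their predictions combined by majority vote.

```python
# Minimal ensemble-learning sketch: three different models vote on each prediction.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# A small synthetic dataset stands in for real data.
X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Combine three heterogeneous models; the ensemble predicts by majority vote.
ensemble = VotingClassifier(estimators=[
    ("lr", LogisticRegression(max_iter=1000)),
    ("tree", DecisionTreeClassifier(random_state=0)),
    ("nb", GaussianNB()),
])
ensemble.fit(X_train, y_train)
print(accuracy_score(y_test, ensemble.predict(X_test)))
```

A random forest applies the same principle with many decision trees, each trained on a random sample of the data.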


Hyperparameter Tuning with Grid Search and Random Search

#artificialintelligence

Hyperparameters are parameters defined before training to specify how we want model training to happen. We have full control over hyperparameter settings, and through them we control the learning process. An integer-valued hyperparameter, for instance, can be set to any value, but setting it to 10 or to 1000 changes the learning process significantly. Parameters, on the other hand, are found during training. We have no control over parameter values, as they are the result of model training.
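The distinction can be illustrated with scikit-learn (assumed installed): the number of trees in a random forest is a hyperparameter we choose up front, while the fitted trees themselves are parameters produced by training.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=200, random_state=0)

# Hyperparameters: set before training to control the learning process.
model = RandomForestClassifier(n_estimators=100, max_depth=3, random_state=0)

# Parameters: learned during training (here, the 100 fitted trees).
model.fit(X, y)
print(len(model.estimators_))
```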


Dealing with Unbalanced Classes, SVMs, Random Forests, and Decision Trees in Python

@machinelearnbot

So far I have talked about decision trees and ensembles, and I hope I have conveyed the logic behind these concepts without getting too deep into the mathematical details. In this post, let's get into action: I will implement the concepts covered in those two blog posts. The only concept I haven't discussed yet is the SVM; for that, I suggest watching Professor Andrew Ng's week 7 videos on Coursera.
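One common way to handle unbalanced classes with these models, sketched here under the assumption that scikit-learn is used, is the `class_weight="balanced"` option, which reweights samples inversely to class frequency for both SVMs and random forests.

```python
# Minimal sketch: training an SVM and a random forest on unbalanced data
# with class_weight="balanced" to counteract the skewed class frequencies.
from sklearn.datasets import make_classification
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier

# Roughly 95% of samples belong to class 0, 5% to class 1.
X, y = make_classification(n_samples=400, weights=[0.95], random_state=0)

svm = SVC(class_weight="balanced", random_state=0).fit(X, y)
forest = RandomForestClassifier(class_weight="balanced", random_state=0).fit(X, y)
```

Without the reweighting, both models tend to favor the majority class; other remedies include resampling the training data.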


Hyperparameter Optimization Techniques to Improve Your Machine Learning Model's Performance

#artificialintelligence

When working on a machine learning project, you follow a series of steps until you reach your goal. One of those steps is hyperparameter optimization on your selected model. This task always comes after model selection, where you choose the model that performs best among the candidates. Before defining hyperparameter optimization, you need to understand what a hyperparameter is. In short, hyperparameters are parameter values used to control the learning process, and they have a significant effect on the performance of machine learning models. These values are tunable and directly affect how well a model trains. Hyperparameter optimization, then, is the process of finding the combination of hyperparameter values that achieves maximum performance on the data in a reasonable amount of time.
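A minimal sketch of this process with scikit-learn (assumed installed): grid search exhaustively tries every combination in a small grid; random search (`RandomizedSearchCV`) follows the same interface but samples combinations instead.

```python
# Grid search over a small hyperparameter grid, scored by 3-fold cross-validation.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=300, random_state=0)

# Candidate hyperparameter values to combine and evaluate.
param_grid = {"n_estimators": [10, 50], "max_depth": [2, 5]}

search = GridSearchCV(RandomForestClassifier(random_state=0), param_grid, cv=3)
search.fit(X, y)
print(search.best_params_)  # the best-scoring combination found
```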