Statistical Learning


10 Best Machine Learning Courses Online for Beginners

#artificialintelligence

Do you want to learn machine learning, and are you looking for the best machine learning courses online for beginners? If so, then this article is for you. In this article, you will find the 10 best machine learning courses online for beginners. So, give this article a few minutes and find the best machine learning course online for beginners. Now, without any further ado, let's get started. This is one of the best online courses for machine learning beginners.


A Guide to Generalization and Regularization in Machine Learning

#artificialintelligence

Generalization and regularization are two frequently used terms that play a significant role when you aim to build a robust machine learning model. One term refers to the model's behaviour, and the other is responsible for enhancing the model's performance. Put simply, regularization helps machine learning models generalize better. In this post, we will cover each aspect of these terms and try to understand how they are linked to each other. The major points to be discussed in this article are outlined below.
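The link between the two terms can be sketched in a few lines of Python: a minimal ridge (L2) regression on made-up 1-D data. The function `ridge_slope` and the sample values are illustrative, not taken from the article; the point is that increasing the penalty `lam` shrinks the learned slope, which is the regularization effect the post discusses.

```python
# Minimal ridge (L2) regression for a 1-D model y = w*x with no intercept.
# Closed form for the slope: w = sum(x*y) / (sum(x*x) + lam)
def ridge_slope(xs, ys, lam):
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    return sxy / (sxx + lam)

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 8.1]                # roughly y = 2x, with noise

w_unreg = ridge_slope(xs, ys, lam=0.0)   # ordinary least squares
w_reg = ridge_slope(xs, ys, lam=10.0)    # penalized: slope shrinks toward 0
print(w_unreg, w_reg)
```

The penalized slope is always closer to zero than the unregularized one; on noisy data this shrinkage is what trades a little training fit for better generalization.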


Vector Calculus for Machine Learning

#artificialintelligence

To keep this post as engaging and entertaining as possible, I will first introduce a brief history of calculus and explain why I think it is so cool. Then we will review fundamental concepts from high school calculus, such as the derivative rules. Next, we will get our feet wet with vectors and matrices to make sure you are comfortable with these mathematical objects before covering partial and vector derivatives. Finally, I will conclude this post with the concept of a gradient, the intuition behind optimization with gradient descent, and a cool implementation of calculus in Python leveraging the library SymPy. Feel free to skip any sections if you are already comfortable with those topics. At its core, calculus is just a very special way of thinking about large problems by splitting them into several smaller ones.
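The post computes its derivatives symbolically with SymPy; as a quick library-free illustration of partial derivatives and gradients, a central-difference approximation works too. The function `grad` and the test function below are made up for this sketch:

```python
# Numerical partial derivatives via central differences: for each coordinate,
# nudge the point up and down by h and take the slope of f between the two.
def grad(f, point, h=1e-6):
    g = []
    for i in range(len(point)):
        up = list(point); up[i] += h
        dn = list(point); dn[i] -= h
        g.append((f(up) - f(dn)) / (2 * h))
    return g

# f(x, y) = x**2 + 3*x*y  ->  df/dx = 2x + 3y,  df/dy = 3x
f = lambda p: p[0] ** 2 + 3 * p[0] * p[1]
print(grad(f, [1.0, 2.0]))  # approximately [8.0, 3.0]
```

At the point (1, 2), the analytic gradient is (2·1 + 3·2, 3·1) = (8, 3), and the numerical sketch agrees to several decimal places.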


Support Vector Machines, Illustrated

#artificialintelligence

Support vector machines (SVMs) are a class of techniques that enjoyed great popularity in the data science community. They are mainly used in classification tasks and perform really well when little training data is available. Sadly, SVMs have been almost forgotten lately due to the massive popularity of deep learning. But in my opinion, they are a tool that every data scientist should have in their toolbox, because they are faster to train and sometimes even outperform neural networks. In this blog, you will learn that SVMs use hyperplanes to separate and classify our data.
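The closing claim, that SVMs classify with hyperplanes, can be sketched without any libraries: prediction is just the sign of w·x + b. The weights below are hand-picked for illustration, not fitted by an SVM solver:

```python
# Classify a point by which side of the hyperplane w.x + b = 0 it falls on.
def predict(w, b, x):
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if score >= 0 else -1

w, b = [1.0, -1.0], 0.0                 # separating line y = x in 2-D
print(predict(w, b, [2.0, 1.0]))        # below the line -> class  1
print(predict(w, b, [1.0, 3.0]))        # above the line -> class -1
```

What an actual SVM adds to this picture is the training step: it chooses w and b to maximize the margin between the hyperplane and the closest training points (the support vectors).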


Four Basic Steps in Data Preparation - KDnuggets

#artificialintelligence

What are the steps in data preparation? Are there specific steps we need to take for specific problems? The answer is not that straightforward: practice and knowledge will design the best recipe for each case. First, there are two types of data preparation: KPI calculation, to extract information from the raw data, and data preparation for the data science algorithm. While the first is domain- and business-dependent, the second is more standardized.
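As one concrete instance of the more standardized kind of preparation, here is a minimal z-score standardization sketch in pure Python (the function name and sample values are illustrative, not from the article):

```python
# Z-score standardization: rescale values to mean 0 and unit variance,
# a common preparation step before many data science algorithms.
def standardize(values):
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    return [(v - mean) / var ** 0.5 for v in values]

scaled = standardize([10.0, 20.0, 30.0])
print(scaled)  # centered on 0, symmetric around the mean
```

Steps like this are algorithm-driven rather than business-driven, which is why they transfer between projects far more readily than KPI calculations do.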


t-SNE Machine Learning Algorithm -- A Great Tool for Dimensionality Reduction in Python

#artificialintelligence

A successful data scientist understands a wide range of machine learning algorithms and can explain the results to stakeholders. Unfortunately, not every stakeholder has enough training to grasp the complexities of ML. Luckily, we can aid our explanations by using dimensionality reduction techniques to create visual representations of high-dimensional data. This article will take you through one such technique, called t-Distributed Stochastic Neighbor Embedding (t-SNE). Perfect categorization of machine learning techniques is not always possible because of the flexibility certain algorithms demonstrate, which makes them useful for solving different problems (e.g., one can use k-NN for both regression and classification).
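A minimal t-SNE sketch with scikit-learn, assuming the library is installed; the random data here is purely illustrative:

```python
# Embed 5-dimensional points into 2-D with t-SNE so they can be plotted.
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))            # 20 points in 5 dimensions

emb = TSNE(n_components=2, perplexity=5.0,
           init="random", random_state=0).fit_transform(X)
print(emb.shape)                        # each point now has 2-D coordinates
```

The resulting 2-D coordinates can be fed straight into a scatter plot, which is the visual aid for stakeholders the article has in mind.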


Gradient Descent with 'Math'

#artificialintelligence

In the last blog, we covered the basics of gradient descent and how it works. This time, we will look at the math behind it. We subtract a small amount from the parameter's value and update it, and we keep doing this until we reach an optimized parameter value at which the cost is minimal. You may be wondering why the '-' sign is used in the equation above. If you look at the image below, on the right side of the curve the slope is positive, so by subtracting a value from theta we move closer to the optimal value; on the left side, the slope is negative, so subtracting actually adds to theta, again moving us closer to the optimal value. We keep updating theta until the change in its value is below some small threshold, such as 0.001 (values may vary from case to case). A common choice for the learning rate is 0.01.
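The update rule and stopping criterion described above can be sketched in a few lines of Python; the quadratic cost and the constants are illustrative choices, not from the post:

```python
# Gradient descent on the 1-D cost J(theta) = (theta - 3)**2,
# using the update theta <- theta - lr * dJ/dtheta and stopping once
# the update is smaller than 0.001, as described in the post.
def gradient_descent(lr=0.01, tol=0.001, theta=0.0):
    while True:
        grad = 2 * (theta - 3)      # dJ/dtheta
        step = lr * grad
        theta -= step               # the '-' sign moves us down the slope
        if abs(step) < tol:
            return theta

print(gradient_descent())           # converges near the minimizer theta = 3
```

Starting left of the minimum, the gradient is negative, so subtracting a negative step increases theta, exactly the sign behaviour the post explains.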


An overview on gradient descent and its variants

#artificialintelligence

The term "optimization" refers to the process of iteratively adjusting a model's parameters to minimize the cost function. It is crucial since it helps us obtain a model with the least amount of error (there will always be discrepancies between the actual and predicted values). There are various optimization methods; in this article, we'll look at gradient descent and its three forms: batch, stochastic, and mini-batch. Note: hyperparameter optimization is required to fine-tune the model, and you must specify the hyperparameters before you begin training.
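The three variants differ only in how many examples feed each parameter update: batch uses the whole dataset, stochastic a single example, and mini-batch a small group. A minimal sketch fitting a one-parameter model (all names and data below are illustrative):

```python
# Fit y = w * x by gradient descent on mean squared error; the batch_size
# argument selects the variant: len(data) -> batch, 1 -> stochastic,
# anything in between -> mini-batch.
import random

def fit_slope(xs, ys, batch_size, lr=0.01, epochs=200, seed=0):
    random.seed(seed)
    w = 0.0
    idx = list(range(len(xs)))
    for _ in range(epochs):
        random.shuffle(idx)
        for s in range(0, len(idx), batch_size):
            batch = idx[s:s + batch_size]
            # gradient of the mean squared error over this batch
            g = sum(2 * (w * xs[i] - ys[i]) * xs[i] for i in batch) / len(batch)
            w -= lr * g
    return w

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]                      # y = 2x exactly
print(fit_slope(xs, ys, batch_size=len(xs)))   # batch
print(fit_slope(xs, ys, batch_size=1))         # stochastic
print(fit_slope(xs, ys, batch_size=2))         # mini-batch
```

All three recover a slope near 2 here; in practice they trade off update cost (batch is expensive per step) against update noise (stochastic is cheap but jittery), with mini-batch as the usual compromise.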

