### Machine Learning Basics: Polynomial Regression

In previous stories, I gave a brief overview of Linear Regression and showed how to perform Simple and Multiple Linear Regression. In this article, we will walk through a program for building a Polynomial Regression model on non-linear data. In the earlier Linear Regression examples, the plotted data showed a linear relationship between the dependent and independent variables, so a linear model was well suited to producing accurate predictions. But what if the data points exhibit non-linearity, so that a linear model produces large prediction errors? In that case, we have to fit a polynomial model that accurately captures the relationship in the data.
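As a minimal sketch of the idea, the snippet below fits a Polynomial Regression model with scikit-learn on synthetic non-linear data (the data here is invented for illustration, not taken from the article):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(100, 1))  # one independent variable
# Quadratic ground truth plus a little noise -- clearly non-linear.
y = 0.5 * X[:, 0] ** 2 - X[:, 0] + 2 + rng.normal(0, 0.2, size=100)

# Degree-2 polynomial features turn the non-linear problem into an
# ordinary linear regression in the expanded feature space.
model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(X, y)

print(model.score(X, y))  # R^2 on the training data
```

A plain `LinearRegression` fit to the same data would score noticeably lower, which is exactly the failure mode the paragraph describes.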

### Polynomial Regression -- explained

This article requires knowledge of Linear Regression. If you haven't heard of it, please check out an article on Linear Regression before proceeding. So far we have assumed that the relationship between the independent variable X and the dependent variable Y can be represented by a straight line. But what if we can't represent the relationship with a straight line, because the relationship itself is not linear? In such scenarios, we can turn to polynomial regression.
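The key point is that polynomial regression is still linear regression: we regress Y on powers of X. A quick sketch with NumPy alone (data invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-2, 2, 50)
# Cubic ground truth: y = x^3 - 2x, plus small noise.
y = x ** 3 - 2 * x + rng.normal(0, 0.1, size=x.size)

# np.polyfit solves ordinary least squares over the columns
# [x^3, x^2, x, 1], so the "non-linear" fit is a linear solve.
coeffs = np.polyfit(x, y, deg=3)
y_hat = np.polyval(coeffs, x)

print(coeffs)  # should be close to [1, 0, -2, 0]
```

The recovered coefficients land near the true values, which shows that adding polynomial terms is enough to capture this kind of curvature.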

### Selecting the Best Machine Learning Algorithm for Your Regression Problem

When approaching any type of Machine Learning (ML) problem, there are many different algorithms to choose from. In machine learning, the "No Free Lunch" theorem states that no single ML algorithm is best for all problems. The performance of different ML algorithms depends strongly on the size and structure of your data. Thus, the correct choice of algorithm often remains unclear unless we test our candidates directly through plain old trial and error. Still, each ML algorithm has pros and cons that we can use as guidance.
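The trial-and-error approach described above can be sketched as a simple cross-validated comparison of a few regressors on the same data set (the models and data chosen here are illustrative assumptions, not the article's):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(2)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.1, size=200)  # non-linear target

scores = {}
for name, est in [("linear", LinearRegression()),
                  ("tree", DecisionTreeRegressor(max_depth=4, random_state=0)),
                  ("knn", KNeighborsRegressor())]:
    # 5-fold cross-validated R^2: one number per candidate algorithm.
    scores[name] = cross_val_score(est, X, y, cv=5, scoring="r2").mean()

for name, score in scores.items():
    print(f"{name}: {score:.3f}")
```

On this sine-shaped data the flexible models outscore the straight line, illustrating how the best choice depends on the structure of the data rather than on the algorithm alone.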

### Artificial Intelligence #2: Polynomial & Logistic Regression

In statistics, Logistic Regression (also called logit regression or the logit model) is a regression model where the dependent variable (DV) is categorical. This article covers the case of a binary dependent variable, where the output can take only two values, "0" and "1", representing outcomes such as pass/fail, win/lose, alive/dead or healthy/sick. Cases where the dependent variable has more than two outcome categories may be analysed with multinomial logistic regression, or, if the categories are ordered, with ordinal logistic regression. In the terminology of economics, logistic regression is an example of a qualitative response/discrete choice model. Logistic Regression was developed by the statistician David Cox in 1958.
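A minimal sketch of the binary case with scikit-learn, using an invented pass/fail toy data set (hours studied vs. exam outcome) purely for illustration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hours studied (feature) and pass (1) / fail (0) outcome.
hours = np.array([[0.5], [1.0], [1.5], [2.0], [2.5],
                  [3.0], [3.5], [4.0], [4.5], [5.0]])
passed = np.array([0, 0, 0, 0, 1, 0, 1, 1, 1, 1])

clf = LogisticRegression().fit(hours, passed)

# predict_proba returns [P(fail), P(pass)] for each input;
# the model outputs a probability, then thresholds it at 0.5.
print(clf.predict_proba([[4.0]])[0, 1])
```

Unlike linear regression, the output here is a probability squeezed through the logistic (sigmoid) function, which is what makes the model suitable for categorical outcomes.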