### 7 Types of Regression Techniques you should know

Linear and logistic regression are usually the first algorithms people learn in predictive modeling. Because of their popularity, many analysts end up thinking they are the only forms of regression, and those slightly more involved think they are the most important of all forms of regression analysis. In truth, there are innumerable forms of regression that can be performed. Each has its own importance and specific conditions under which it is best applied.

This article was posted by Sunil Ray. Sunil is a Business Analytics and Intelligence professional with deep experience in the Indian Insurance industry.

### Learn the Concept of linearity in Regression Models

This tutorial covers the basics of linear regression by discussing in depth the concept of linearity and which type of linearity is desirable. A function Y = f(X) is said to be linear in X if X appears with a power or index of 1 only; equivalently, Y is linearly related to X if the rate of change of Y with respect to X (dY/dX) is independent of the value of X. Linear regression, however, always means linearity in the parameters, irrespective of linearity in the explanatory variables. The variable X can enter non-linearly, say as X², and the model is still a linear regression; but if the parameters themselves enter non-linearly, it is not.
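A minimal sketch of this idea (illustrative only, not code from the tutorial): a model such as Y = b0 + b1·X² is non-linear in X but linear in its parameters, so ordinary least squares still applies after transforming the explanatory variable.

```python
# Fitting y = b0 + b1*x**2 by ordinary least squares.
# The model is non-linear in x but linear in the parameters b0, b1,
# so transforming z = x**2 reduces it to simple linear regression.

def ols_fit(z, y):
    """Closed-form simple linear regression: y ~ b0 + b1*z."""
    n = len(z)
    zbar = sum(z) / n
    ybar = sum(y) / n
    b1 = sum((zi - zbar) * (yi - ybar) for zi, yi in zip(z, y)) / \
         sum((zi - zbar) ** 2 for zi in z)
    b0 = ybar - b1 * zbar
    return b0, b1

# Hypothetical data generated from y = 2 + 3*x**2.
x = [0, 1, 2, 3, 4]
y = [2 + 3 * xi ** 2 for xi in x]

# Transform the explanatory variable, then fit a *linear* regression.
z = [xi ** 2 for xi in x]
b0, b1 = ols_fit(z, y)
print(b0, b1)  # recovers intercept 2 and coefficient 3
```

By contrast, a model like Y = b0 + X^b1, where a parameter appears in the exponent, cannot be handled this way by any transformation of X alone.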

### 23 types of regression

This contribution is from David Corliss, who teaches a class on this subject, giving (very brief) descriptions of 23 regression methods in just an hour, with an example and the package and procedures used for each case.