23 types of regression

@machinelearnbot

This contribution is from David Corliss. David teaches a class on this subject, giving a (very brief) description of 23 regression methods in just an hour, with an example and the package and procedures used for each case. Here you can check the webcast he did for Central Michigan University. For instance, I would add piecewise linear regression, as well as regression on unusual domains (on a sphere or on the simplex).
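
As a minimal sketch of one of those suggested additions, here is piecewise linear regression in base R, assuming a single known breakpoint at x = 10 (an assumption made purely for illustration; the segmented package could instead estimate the breakpoint from the data):

# Piecewise linear regression with one known knot at x = 10 (illustrative assumption)
set.seed(1)
x <- runif(100, 0, 20)
y <- ifelse(x < 10, 2 * x, 20 + 0.5 * (x - 10)) + rnorm(100)  # simulated data with a slope change
fit <- lm(y ~ x + pmax(x - 10, 0))  # the hinge term pmax(x - 10, 0) lets the slope change after the knot
summary(fit)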


R FUNCTIONS FOR REGRESSION ANALYSIS – Step Up Analytics

#artificialintelligence

Here are some helpful R functions for regression analysis, grouped by their goal; the name of the package is in parentheses.

- Base R has a method for objects inheriting from class "lm" (stats)
- A generic function that currently only has methods for objects inheriting from classes "lm" and "glm" (stats)
- AIC: generic function calculating the Akaike information criterion for one or several fitted model objects for which a log-likelihood value can be obtained, according to the formula -2*log-likelihood + k*npar, where npar represents the number of parameters in the fitted model, and k = 2 for the usual AIC, or k = log(n) (n being the number of observations) for the so-called BIC or SBC (Schwarz's Bayesian criterion) (stats)
- plot.lm: four plots (selectable by which) are currently provided: a plot of residuals against fitted values, a Scale-Location plot of sqrt(|residuals|) against fitted values, a Normal Q-Q plot, and a plot of Cook's distances versus row labels (stats)
- bartlett.test: performs Bartlett's test of the null that the variances in each of the groups (samples) are the same (stats)
- bgtest: Breusch-Godfrey Test (lmtest)
- bptest: Breusch-Pagan Test (lmtest)
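
As a small usage sketch, the functions above can be applied to a toy lm fit on the built-in mtcars data (lmtest is assumed to be installed for bgtest() and bptest()):

library(lmtest)                                  # provides bgtest() and bptest()
fit <- lm(mpg ~ wt + hp, data = mtcars)          # toy linear model
AIC(fit)                                         # Akaike information criterion, k = 2
AIC(fit, k = log(nrow(mtcars)))                  # BIC/SBC variant, k = log(n)
par(mfrow = c(2, 2))
plot(fit)                                        # the four diagnostic plots described above
bartlett.test(mpg ~ factor(cyl), data = mtcars)  # Bartlett's test of equal group variances
bgtest(fit)                                      # Breusch-Godfrey test for serial correlation
bptest(fit)                                      # Breusch-Pagan test for heteroskedasticity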



15 Types of Regression you should know

@machinelearnbot

In the case of multiple variables, say X1 and X2, we can create a third new feature (say X3) which is the product of X1 and X2, i.e. X3 = X1 * X2. First, we read the data using read.csv().
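
As a hypothetical sketch of that step in R (the file name data.csv and the column names y, X1, X2 are placeholders, not from the article):

df <- read.csv("data.csv")              # placeholder file with columns y, X1, X2
df$X3 <- df$X1 * df$X2                  # new feature: the product of X1 and X2
fit <- lm(y ~ X1 + X2 + X3, data = df)  # linear model including the interaction feature
summary(fit)
# equivalently, lm(y ~ X1 * X2, data = df) expands to the main effects plus their product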