15 Types of Regression you should know

@machinelearnbot

In the case of multiple variables, say X1 and X2, we can create a third feature (say X3) which is the product of X1 and X2, i.e. an interaction term. First, we read the data using read.csv().
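The original article works in R (hence read.csv()); as a rough illustration of the same idea in Python, here is a minimal sketch that builds the interaction feature and fits an ordinary least squares model. The file name data.csv and the column names X1, X2, y are placeholders, not taken from the article.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical input file and column names; the article itself uses R's read.csv().
df = pd.read_csv("data.csv")

# Interaction term: a new feature that is the product of X1 and X2.
df["X3"] = df["X1"] * df["X2"]

# Ordinary least squares with the two original features plus the interaction.
X = sm.add_constant(df[["X1", "X2", "X3"]])
model = sm.OLS(df["y"], X).fit()
print(model.params)
```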


Predicting Conditional Quantiles via Reduction to Classification

arXiv.org Machine Learning

We show how to reduce the process of predicting general order statistics (and the median in particular) to solving classification. The accompanying theoretical statement shows that the regret of the classifier bounds the regret of the quantile regression under a quantile loss. We also test this reduction empirically against existing quantile regression methods on large real-world datasets and discover that it provides state-of-the-art performance.
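The paper's exact reduction and regret bound are in the full text; the sketch below only illustrates one common way to recover a conditional quantile from binary classifiers, and is not the authors' construction. The idea: for a grid of thresholds t, train a classifier that estimates P(y <= t | x), then report the smallest threshold whose estimated probability reaches the target quantile. The synthetic data, threshold grid, and choice of logistic regression are all assumptions for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic data with heteroscedastic noise, so conditional quantiles differ from the mean.
X = rng.uniform(0, 10, size=(2000, 1))
y = 2.0 * X[:, 0] + rng.normal(0, 1 + 0.5 * X[:, 0])

# One binary classifier per threshold t, each estimating P(y <= t | x).
thresholds = np.quantile(y, np.linspace(0.05, 0.95, 19))
classifiers = []
for t in thresholds:
    clf = LogisticRegression(max_iter=1000)
    clf.fit(X, (y <= t).astype(int))
    classifiers.append(clf)

def predict_quantile(x_new, q):
    """For each row of x_new, return the smallest threshold whose
    estimated P(y <= t | x) reaches the target quantile q."""
    probs = np.column_stack([c.predict_proba(x_new)[:, 1] for c in classifiers])
    idx = np.argmax(probs >= q, axis=1)                 # first threshold crossing q
    idx[probs.max(axis=1) < q] = len(thresholds) - 1    # fall back to the largest threshold
    return thresholds[idx]

print(predict_quantile(np.array([[2.0], [8.0]]), q=0.5))  # rough conditional medians
```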


Quantile Regression in Python

@machinelearnbot

You can see that our intercept is 6.0398 and our slope, the coefficient for x, is 0.0934. These are the parameters for the 0.5 quantile (the median) of y. Similarly, we can fit models for other quantiles. Inside the for loop we build a model for each quantile in our list quantiles. As we build these models, we also store the model parameters in a list called params.
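The intercept and slope quoted above come from the post's own dataset; the sketch below only shows the loop structure described, using statsmodels' quantreg on synthetic data. The column names x and y and the list of quantiles are assumptions, not the post's actual values.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)

# Synthetic stand-in for the post's dataset (column names x and y are assumptions).
df = pd.DataFrame({"x": rng.uniform(0, 100, 500)})
df["y"] = 6 + 0.1 * df["x"] + rng.normal(0, 2, 500)

mod = smf.quantreg("y ~ x", df)

# Fit one model per quantile and store its parameters, as the post describes.
quantiles = [0.1, 0.25, 0.5, 0.75, 0.9]
params = []
for q in quantiles:
    res = mod.fit(q=q)
    params.append(res.params)   # intercept and slope for this quantile

print(pd.DataFrame(params, index=quantiles))
```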


23 Types of Regression

@machinelearnbot

This contribution is from David Corliss. David teaches a class on this subject, giving a (very brief) description of 23 regression methods in just an hour, with an example and the package and procedures used for each case. Here you can check the webcast done for Central Michigan University. For instance, I would add piecewise linear regression, as well as regression on unusual domains (on a sphere or on the simplex).