
Steps of Modelling

@machinelearnbot

The data are usually recorded in rows and columns. A column represents a variable, whereas a row represents an observation, which is a set of p + 1 values for a single subject, i.e., one value for the response variable and one value for each of the p predictors. Each of the variables can be classified as either quantitative or qualitative. The technique used when the response variable is binary is called logistic regression. In regression analysis, the predictor variables can be either quantitative or qualitative.
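To make the binary-response case concrete, here is a minimal sketch of a logistic regression on a small table with one quantitative and one qualitative predictor, using scikit-learn; the column names and values are invented purely for illustration.

import pandas as pd
from sklearn.linear_model import LogisticRegression

# Each row is one observation: a binary response plus p predictor values.
# (Hypothetical toy data, not from the article.)
df = pd.DataFrame({
    "age":     [23, 45, 31, 52, 38, 60],                 # quantitative predictor
    "smoker":  ["yes", "no", "no", "yes", "no", "yes"],  # qualitative predictor
    "disease": [0, 1, 0, 1, 0, 1],                       # binary response
})

# Qualitative predictors are encoded as dummy (indicator) variables.
X = pd.get_dummies(df[["age", "smoker"]], drop_first=True)
y = df["disease"]

model = LogisticRegression().fit(X, y)
print(model.coef_, model.intercept_)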



Poisson Subsampling Algorithms for Large Sample Linear Regression in Massive Data

arXiv.org Machine Learning

Large sample sizes create a computation bottleneck for modern data analysis. Subsampling is one of the efficient strategies for handling this problem. Previous studies have focused more on subsampling with replacement (SSR) than on subsampling without replacement (SSWR). In this paper we investigate a kind of SSWR, Poisson subsampling (PSS), as a fast algorithm for the ordinary least-squares problem. We establish a non-asymptotic property, i.e., an error bound for the corresponding subsample estimator, which provides a trade-off between computation cost and approximation efficiency. Beyond the non-asymptotic result, we provide asymptotic consistency and normality of the subsample estimator. Methodologically, we propose a two-step subsampling algorithm, which is efficient with respect to a statistical objective and independent of the linear model assumption. Synthetic and real data are used to empirically study our proposed subsampling strategies. These empirical studies show that (1) our proposed two-step algorithm has a clear advantage when the assumed linear model is not accurate, and (2) the PSS strategy performs noticeably better than SSR as the subsampling ratio increases.
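As a rough illustration of the general idea (not the authors' two-step algorithm), the sketch below applies Poisson subsampling with uniform inclusion probabilities to an ordinary least-squares problem and refits with inverse-probability weights; the variable names, sample sizes, and simulated data are assumptions made for the example.

import numpy as np

rng = np.random.default_rng(0)
n, p = 100_000, 10
X = rng.normal(size=(n, p))
beta = rng.normal(size=p)
y = X @ beta + rng.normal(size=n)

# Uniform inclusion probabilities giving an expected subsample size r.
r = 2_000
pi = np.full(n, r / n)

# Poisson subsampling: an independent Bernoulli(pi_i) draw for every row,
# so the realized subsample size is random with mean r.
keep = rng.random(n) < pi
w = 1.0 / pi[keep]                      # inverse-probability weights

# Weighted least squares on the subsample.
Xs, ys = X[keep], y[keep]
beta_sub = np.linalg.solve(Xs.T @ (w[:, None] * Xs), Xs.T @ (w * ys))

# Compare against the full-sample OLS estimate.
beta_full = np.linalg.solve(X.T @ X, X.T @ y)
print(np.linalg.norm(beta_sub - beta_full))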


5 Questions which can teach you Multiple Regression (with R and Python)

@machinelearnbot

A journey of a thousand miles begins with a single step. In a similar way, the journey of mastering machine learning algorithms ideally begins with regression. It is simple to understand and gets you started with predictive modeling quickly. While this ease is good for a beginner, I always advise them to also understand how regression works before they start using it. Lately, I have seen a lot of beginners who focus only on learning how to perform regression (in R or Python) but not on the actual science behind it.
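For readers who want to see the mechanics before diving into the five questions, a minimal multiple regression fit in Python with statsmodels might look like the sketch below; the predictor names and simulated data are assumptions for illustration, not the article's data set.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 200
tv = rng.uniform(0, 100, n)              # hypothetical predictor 1
radio = rng.uniform(0, 50, n)            # hypothetical predictor 2
sales = 3.0 + 0.05 * tv + 0.10 * radio + rng.normal(scale=1.0, size=n)

# Multiple regression: intercept plus two predictors.
X = sm.add_constant(np.column_stack([tv, radio]))
fit = sm.OLS(sales, X).fit()
print(fit.summary())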


Learn Generalized Linear Models (GLM) using R

@machinelearnbot

The Generalized Linear Model (GLM) helps represent the dependent variable as a linear combination of independent variables. Simple linear regression is the traditional form of GLM, and it works well when the dependent variable is normally distributed. The assumption of a normally distributed dependent variable is often violated in real situations. For example, consider a case where the dependent variable can take only positive values and has a fat tail.
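For the positive, fat-tailed case just mentioned, one common choice is a Gamma GLM with a log link. The article itself uses R, so the Python/statsmodels sketch below is only an assumed, minimal illustration on simulated data.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
x = rng.uniform(size=n)
mu = np.exp(0.5 + 2.0 * x)                 # mean on the log-link scale
y = rng.gamma(shape=2.0, scale=mu / 2.0)   # strictly positive, right-skewed response

# Gamma GLM with a log link instead of assuming a normal response.
X = sm.add_constant(x)
glm = sm.GLM(y, X, family=sm.families.Gamma(link=sm.families.links.Log()))
print(glm.fit().summary())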