Machine Learning Basics: Building a Regression model in R

#artificialintelligence

You're looking for a complete Linear Regression course that teaches you everything you need to create a Linear Regression model in R, right? You've found the right Linear Regression course! How will this course help you? A Verifiable Certificate of Completion is presented to all students who undertake this Machine Learning basics course. Why should you choose this course?


Machine Learning for Beginners-Regression Analysis in Python

#artificialintelligence

You're looking for a complete Linear Regression course that teaches you everything you need to create a Linear Regression model in Python, right? You've found the right Linear Regression course! Identify the business problems that can be solved with the linear regression technique of Machine Learning. Create a linear regression model in Python and analyze its results. A Verifiable Certificate of Completion is presented to all students who undertake this Machine Learning basics course.
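As a rough sketch of what that workflow might look like (the data and parameters here are made up, not taken from the course), a minimal linear regression fit and result check in Python with scikit-learn:

```python
# Minimal sketch: fit a simple linear regression on synthetic data
# and inspect its coefficients and R^2. Data and true slope/intercept
# are illustrative assumptions, not from the course.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 1))              # one independent variable
y = 3.0 * X[:, 0] + 5.0 + rng.normal(0, 1, 200)    # linear signal plus noise

model = LinearRegression().fit(X, y)
y_pred = model.predict(X)

print("slope:", model.coef_[0])        # expected to be close to 3.0
print("intercept:", model.intercept_)  # expected to be close to 5.0
print("R^2:", r2_score(y, y_pred))
```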


Logistic Regression vs Decision Trees vs SVM: Part II

@machinelearnbot

This is the second part of the series. In this part we'll discuss how to choose between Logistic Regression, Decision Trees and Support Vector Machines. The most correct answer, as mentioned in the first part of this two-part article, is still "it depends." We'll continue our effort to shed some light on what it depends on. All three of these techniques have certain properties inherent to their design; we'll elaborate on some of them to give you a few pointers on selecting one for your particular business problem.
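To make the comparison concrete, here is a minimal sketch of one common way to benchmark the three model families side by side with scikit-learn; the dataset and hyperparameters are illustrative assumptions, not taken from the article:

```python
# Rough illustration: cross-validate Logistic Regression, a Decision Tree
# and an SVM on the same built-in dataset to compare accuracy.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

models = {
    "logistic regression": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "decision tree": DecisionTreeClassifier(max_depth=5, random_state=0),
    "SVM (RBF kernel)": make_pipeline(StandardScaler(), SVC()),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```

Accuracy alone rarely settles the choice; as the article argues, interpretability, training cost and the shape of the decision boundary usually matter just as much.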


Sparse Hierarchical Regression with Polynomials

arXiv.org Machine Learning

We present a novel method for exact hierarchical sparse polynomial regression. Our regressor is the degree $r$ polynomial which depends on at most $k$ inputs, counting at most $\ell$ monomial terms, and which minimizes the sum of the squares of its prediction errors. This hierarchical sparse specification aligns well with modern big data settings, where many inputs are not relevant for prediction and the functional complexity of the regressor needs to be controlled so as to avoid overfitting. We present a two-step approach to this hierarchical sparse regression problem. First, we discard irrelevant inputs using an extremely fast input ranking heuristic. Second, we take advantage of modern cutting plane methods for integer optimization to solve the resulting reduced hierarchical $(k, \ell)$-sparse problem exactly. The ability of our method to identify all $k$ relevant inputs and all $\ell$ monomial terms is shown empirically to undergo a phase transition. Crucially, the same transition also governs our ability to reject all irrelevant features and monomials. In the regime where our method is statistically powerful, its computational complexity is, interestingly, on par with Lasso-based heuristics. The presented work fills a void: the lack of powerful, disciplined nonlinear sparse regression methods in high-dimensional settings. Our method is shown empirically to scale to regression problems with $n \approx 10,000$ observations for input dimension $p \approx 1,000$.
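For intuition only, here is a minimal sketch of the Lasso-based heuristic the abstract uses as its computational reference point, applied to polynomial features. This is not the authors' exact cutting-plane method; the data, degree and regularization strength are assumptions:

```python
# Loose sketch of a Lasso-on-polynomial-features baseline: expand the inputs
# into degree-2 monomials, then let the L1 penalty select a small subset.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler

rng = np.random.default_rng(0)
n, p = 500, 20                                   # observations, input dimension
X = rng.normal(size=(n, p))
# Target depends on only a few inputs / monomials (sparse hierarchical structure).
y = 2.0 * X[:, 0] + 1.5 * X[:, 1] * X[:, 2] - X[:, 3] ** 2 + rng.normal(scale=0.1, size=n)

poly = PolynomialFeatures(degree=2, include_bias=False)
model = make_pipeline(poly, StandardScaler(), Lasso(alpha=0.05, max_iter=10_000))
model.fit(X, y)

coefs = model.named_steps["lasso"].coef_
names = poly.get_feature_names_out([f"x{i}" for i in range(p)])
selected = [(nm, c) for nm, c in zip(names, coefs) if abs(c) > 1e-3]
print(f"{len(selected)} monomial terms selected out of {len(coefs)}")
for nm, c in selected[:10]:
    print(f"  {nm}: {c:.2f}")
```

An exact $(k, \ell)$-sparse solver of the kind described in the paper would instead enforce hard cardinality constraints on the inputs and monomials via integer optimization, rather than relying on the L1 penalty to keep the selection small.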


How to improve model performance for regression problem?

@machinelearnbot

It simply means there are not enough features, or not enough information, available from your independent variables. Figure out what other features you can add as independent variables that impact the final dependent variable.
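A toy illustration of the point, on entirely synthetic data rather than the original poster's problem: when the target depends on a feature the model never sees, test R^2 stays low, and adding that feature as an independent variable recovers it.

```python
# Toy example (assumed data): adding an informative independent variable
# lifts the test R^2 of an otherwise underfitting linear regression.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 2.0 * x1 + 3.0 * x2 + rng.normal(scale=0.5, size=n)  # y depends on both features

X_small = x1.reshape(-1, 1)            # only one feature available
X_full = np.column_stack([x1, x2])     # feature set after adding x2

for label, X in [("x1 only", X_small), ("x1 + x2", X_full)]:
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    score = LinearRegression().fit(X_tr, y_tr).score(X_te, y_te)
    print(f"{label}: test R^2 = {score:.2f}")
```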