Support Vector Machines, Dual Formulation, Quadratic Programming & Sequential Minimal Optimization

#artificialintelligence

The Support Vector Machine (initially called Support-Vector Networks by its author, Vladimir Vapnik) takes a completely different approach to solving statistical problems, specifically classification. The algorithm has been used heavily in classification problems such as image classification, bag-of-words classifiers, OCR, cancer prediction, and many more. SVM is fundamentally a binary classifier, although it can be extended to multi-class classification as well as regression. Unlike logistic regression and other neural network models, SVMs try to maximize the separation between the two classes of points. The idea behind this is brilliant.
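The maximum-margin idea described above can be sketched in a few lines. This is a minimal illustration assuming scikit-learn is available; the toy data and parameters are not from the article.

```python
# A minimal sketch of binary classification with a linear SVM.
# Assumes scikit-learn; the data below is illustrative only.
import numpy as np
from sklearn.svm import SVC

# Two linearly separable clusters in 2D.
X = np.array([[0.0, 0.0], [1.0, 0.5], [0.5, 1.0],
              [3.0, 3.0], [4.0, 3.5], [3.5, 4.0]])
y = np.array([0, 0, 0, 1, 1, 1])

# A linear SVM finds the separating hyperplane with the largest margin.
clf = SVC(kernel="linear", C=1.0)
clf.fit(X, y)

# Only the points nearest the boundary (support vectors) define it.
print(clf.predict([[0.2, 0.3], [3.8, 3.9]]))
print(clf.support_vectors_)
```

Internally, fitting this model solves the quadratic program of the dual formulation; libraries such as LIBSVM (which scikit-learn wraps) use Sequential Minimal Optimization to solve it.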


SVMs in One Picture

#artificialintelligence

SVMs (Support Vector Machines) are a way to classify data by finding the optimal separating boundary. In 2D, the separator is a line; in 3D, it is a plane; in higher dimensions, it's a hyperplane. For simplicity, the following picture shows how an SVM works on a two-dimensional data set.
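The two-dimensional case is easy to make concrete: the "hyperplane" is just the line w·x + b = 0, and classification is the sign of w·x + b. A hedged sketch, assuming scikit-learn; the data is illustrative:

```python
# Sketch: in 2D the separating hyperplane is a line w0*x0 + w1*x1 + b = 0.
# Assumes scikit-learn; toy data, chosen to be linearly separable.
import numpy as np
from sklearn.svm import LinearSVC

X = np.array([[0, 0], [0, 1], [1, 0],
              [2, 2], [2, 3], [3, 2]], dtype=float)
y = np.array([-1, -1, -1, 1, 1, 1])

clf = LinearSVC(C=10.0).fit(X, y)
w, b = clf.coef_[0], clf.intercept_[0]

# Decision rule: sign(w . x + b) — negative on one side of the line,
# positive on the other.
print(np.sign(X @ w + b))
```

On separable data like this, every training point falls on the correct side of the fitted line.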


Comparing machine learning classifiers based on their hyperplanes, for "package-users"

@machinelearnbot

In the English version of my personal blog, I posted an article about how we should understand the nature of each machine learning classifier; my approach is simply to look at the hyperplane each one produces.


Support Vector Machines for Binary Classification - MATLAB & Simulink

#artificialintelligence

You can use a support vector machine (SVM) when your data has exactly two classes. An SVM classifies data by finding the best hyperplane that separates all data points of one class from those of the other class. The best hyperplane for an SVM is the one with the largest margin between the two classes. The margin is the maximal width of the slab, parallel to the hyperplane, that contains no interior data points.
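For a linear SVM with weight vector w, the width of that slab works out to 2/||w||, which is why training minimizes ||w|| subject to the separation constraints. A small sketch in Python (not MATLAB, which the article above uses), assuming scikit-learn; the data is constructed so the expected margin is known:

```python
# Sketch of the margin definition: slab width = 2 / ||w||.
# Assumes scikit-learn; toy data with clusters at x0 = -2 and x0 = +2,
# so the hard-margin slab has width 4.
import numpy as np
from sklearn.svm import SVC

X = np.array([[-2.0, 0.0], [-2.0, 1.0],
              [ 2.0, 0.0], [ 2.0, 1.0]])
y = np.array([0, 0, 1, 1])

clf = SVC(kernel="linear", C=1e6).fit(X, y)  # large C ~ hard margin
w = clf.coef_[0]
margin = 2.0 / np.linalg.norm(w)
print(round(margin, 2))  # geometric width of the empty slab
```

Because the clusters sit 4 units apart along the first axis, the fitted margin comes out to roughly 4, matching the geometry.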


Generalized version of the support vector machine for binary classification problems: supporting hyperplane machine

arXiv.org Machine Learning

This paper proposes a generalized version of the SVM for binary classification problems in the case of an arbitrary transformation x -> y. An approach similar to the classic SVM method is used, and the problem is explained in detail. Various formulations of the primal and dual problems are proposed. For one of the most important cases, the formulae are derived in detail, and a simple computational example is demonstrated. The algorithm and its implementation are presented in the Octave language.