Proximal gradient method for huberized support vector machine
Xu, Yangyang, Akrotirianakis, Ioannis, Chakraborty, Amit
The Support Vector Machine (SVM) has been used in a wide variety of classification problems. The original SVM uses the hinge loss function, which is non-differentiable and makes the problem difficult to solve, in particular for regularized SVMs such as those with $\ell_1$-regularization. This paper considers the Huberized SVM (HSVM), which uses a differentiable approximation of the hinge loss function. We first explore the use of the Proximal Gradient (PG) method for solving the binary-class HSVM (B-HSVM) and then generalize it to the multi-class HSVM (M-HSVM). Under strong convexity assumptions, we show that our algorithm converges linearly. In addition, we give a finite convergence result for the support of the solution, which we use to further accelerate the algorithm via a two-stage method. We present extensive numerical experiments on both synthetic and real datasets, which demonstrate the superiority of our methods over some state-of-the-art methods for both binary- and multi-class SVMs.
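To make the iteration concrete, the sketch below shows a generic proximal gradient loop for an $\ell_1$-regularized binary HSVM. It is a minimal illustration under assumptions, not the authors' implementation: it uses a common Huberized hinge loss with smoothing parameter `delta` (quadratic on margins in $(1-\delta, 1]$, linear below), an $\ell_1$ weight `lam`, and a fixed step size $1/L$ with $L \le \|X\|_2^2/(n\delta)$ bounding the Lipschitz constant of the smooth part. The names `pg_hsvm`, `huber_hinge_grad`, and `soft_threshold` are illustrative only, and the paper's accelerated and two-stage variants are not shown.

```python
import numpy as np

def huber_hinge_grad(t, delta):
    """Derivative of the Huberized hinge loss at margins t = y * (X @ w):
    0 for t > 1, -(1 - t)/delta for 1 - delta < t <= 1, -1 for t <= 1 - delta."""
    g = np.zeros_like(t)
    mid = (t > 1 - delta) & (t <= 1)
    g[mid] = -(1.0 - t[mid]) / delta
    g[t <= 1 - delta] = -1.0
    return g

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def pg_hsvm(X, y, lam, delta=0.5, n_iter=500):
    """Proximal gradient iteration (illustrative sketch) for
        min_w  (1/n) * sum_i phi_H(y_i * x_i^T w) + lam * ||w||_1,
    with labels y in {-1, +1} and a fixed step 1/L."""
    n, d = X.shape
    step = n * delta / (np.linalg.norm(X, 2) ** 2)  # 1/L from spectral norm bound
    w = np.zeros(d)
    for _ in range(n_iter):
        margins = y * (X @ w)
        grad = X.T @ (y * huber_hinge_grad(margins, delta)) / n  # chain rule
        w = soft_threshold(w - step * grad, step * lam)          # prox step
    return w
```

Each iteration takes a gradient step on the smooth Huberized loss and then applies soft-thresholding, the proximal operator of the $\ell_1$ term; the thresholding is what produces sparse iterates whose support can settle after finitely many steps, which is the property the two-stage acceleration exploits.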
Nov-30-2015