A Preconditioned Interior Point Method for Support Vector Machines Using an ANOVA-Decomposition and NFFT-Based Matrix-Vector Products
Wagner, Theresa, Pearson, John W., Stoll, Martin
–arXiv.org Artificial Intelligence
The training of support vector machines (SVMs) leads to large-scale quadratic programs (QPs) [1]. An efficient way to solve these optimization problems is the sequential minimal optimization (SMO) algorithm introduced in [2]. The main motivation for the design of the SMO algorithm is that general-purpose optimization methods, i.e., standard quadratic programming solvers, cannot handle the large-scale dense kernel matrix efficiently. The SMO algorithm builds on the result of [3], which showed that the optimization problem can be decomposed into a sequence of smaller subproblems, thereby avoiding the large-scale dense matrices. When the training is tackled as a quadratic programming task, the use of interior point methods (IPMs) has also been studied, notably in the seminal paper of Fine and Scheinberg [4]: the authors use a low-rank approximation of the kernel matrix and propose a pivoted low-rank Cholesky decomposition to compute it. A similar matrix approximation was also proposed in [5].
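The pivoted low-rank Cholesky idea mentioned above can be sketched briefly: greedily pick the pivot with the largest remaining diagonal entry of the residual kernel matrix and append one column per step, so that after m steps K ≈ L Lᵀ with L of rank m. The sketch below (a minimal illustration, not the authors' implementation; the function name, Gaussian kernel, and tolerance are illustrative assumptions) shows the basic recursion.

```python
import numpy as np

def pivoted_cholesky(K, max_rank, tol=1e-10):
    """Greedy pivoted Cholesky: returns L (n x m) with K ~= L @ L.T.

    Illustrative sketch; K is assumed symmetric positive semidefinite.
    """
    n = K.shape[0]
    d = np.diag(K).astype(float).copy()  # residual diagonal
    L = np.zeros((n, max_rank))
    pivots = []
    for m in range(max_rank):
        i = int(np.argmax(d))            # largest residual diagonal entry
        if d[i] <= tol:                  # residual negligible: stop early
            return L[:, :m], pivots
        pivots.append(i)
        # New column: residual of column i, scaled by sqrt of the pivot
        L[:, m] = (K[:, i] - L @ L[i, :]) / np.sqrt(d[i])
        d -= L[:, m] ** 2                # update residual diagonal
    return L, pivots

# Example: low-rank approximation of a Gaussian kernel matrix
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 2))
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq_dists)
L, pivots = pivoted_cholesky(K, max_rank=10)
err = np.linalg.norm(K - L @ L.T)
```

Because the Gaussian kernel's eigenvalues decay rapidly, a small rank already captures most of the matrix, which is precisely what makes such approximations attractive inside an IPM, where the kernel matrix appears in every Newton system.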
Dec-1-2023