
Transductive SVM


Export Reviews, Discussions, Author Feedback and Meta-Reviews

Neural Information Processing Systems

Q2: Please summarize your review in 1-2 sentences. The paper proposes a modified SVM learning algorithm in which the loss function incorporates a per-example weight. However, example-dependent costs are already widely used in machine learning.
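
The per-example weighting the review describes can be sketched as a weighted hinge loss, in which each example's cost scales its contribution to the objective. Below is a minimal subgradient-descent sketch; the function name, data, and weight values are illustrative assumptions, not the reviewed paper's method.

```python
import numpy as np

# A sketch (assumed setup) of a linear SVM trained on a per-example
# weighted hinge loss:
#   L(w, b) = 0.5 * lam * ||w||^2 + (1/n) * sum_i c_i * max(0, 1 - y_i (w.x_i + b))
def weighted_svm(X, y, c, lam=0.1, lr=0.1, epochs=500):
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        active = y * (X @ w + b) < 1                      # margin violators
        grad_w = lam * w - (c[active] * y[active]) @ X[active] / n
        grad_b = -np.sum(c[active] * y[active]) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
c = np.where(y == 1, 5.0, 1.0)    # hypothetical example-dependent costs
w, b = weighted_svm(X, y, c)
acc = float(np.mean(np.sign(X @ w + b) == y))
print(acc)
```

Setting all `c_i = 1` recovers the ordinary (average) hinge loss, which is why the reviewer regards per-example costs as a standard extension rather than a new algorithm.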


Efficient Convex Relaxation for Transductive Support Vector Machine

Xu, Zenglin, Jin, Rong, Zhu, Jianke, King, Irwin, Lyu, Michael

Neural Information Processing Systems

We consider the problem of Support Vector Machine transduction, a combinatorial problem whose computational complexity is exponential in the number of unlabeled examples. Although several studies have been devoted to Transductive SVM, they suffer either from high computational complexity or from locally optimal solutions. To address this problem, we propose solving Transductive SVM via a convex relaxation that converts the NP-hard problem into a semi-definite program. Compared with the previous SDP relaxation for Transductive SVM, the proposed algorithm is computationally more efficient, reducing the number of free parameters from O(n^2) to O(n), where n is the number of examples. An empirical study with several benchmark data sets shows the promising performance of the proposed algorithm in comparison with other state-of-the-art implementations of Transductive SVM.
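
The combinatorial problem the abstract refers to can be made concrete with a brute-force toy: for u unlabeled points there are 2^u candidate labelings, and naive transduction fits an SVM for every one of them. The sketch below (illustrative only, not the paper's SDP relaxation) enumerates that search for a tiny u; the relaxation exists precisely to avoid it.

```python
import itertools
import numpy as np

# Toy brute-force SVM transduction: try every labeling of the unlabeled
# points, fit a linear SVM for each, keep the labeling with the lowest
# regularized hinge objective. Cost grows as 2^u in the unlabeled count.
def fit_svm(X, y, lam=0.1, lr=0.1, epochs=300):
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        active = y * (X @ w + b) < 1
        w -= lr * (lam * w - (y[active] @ X[active]) / n)
        b -= lr * (-np.sum(y[active]) / n)
    obj = 0.5 * lam * w @ w + np.mean(np.maximum(0.0, 1 - y * (X @ w + b)))
    return w, b, obj

rng = np.random.default_rng(1)
X_lab = np.array([[-2.0, 0.0], [2.0, 0.0]])
y_lab = np.array([-1, 1])
# Unlabeled points drawn near two clusters at x = -2 and x = +2.
X_unl = rng.normal(size=(4, 2)) + np.sign(rng.normal(size=(4, 1))) * np.array([2.0, 0.0])

best = None
for labels in itertools.product([-1, 1], repeat=len(X_unl)):   # 2^4 labelings
    X = np.vstack([X_lab, X_unl])
    y = np.concatenate([y_lab, labels])
    w, b, obj = fit_svm(X, y)
    if best is None or obj < best[0]:
        best = (obj, np.array(labels))

print(best[1])   # labeling of the unlabeled points with the smallest objective
```

Already at u = 30 this loop would need over a billion SVM fits, which is why the paper replaces the discrete labels with a continuous relaxation solved as a semi-definite program.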


Semi-supervised Learning via Gaussian Processes

Lawrence, Neil D., Jordan, Michael I.

Neural Information Processing Systems

We present a probabilistic approach to learning a Gaussian Process classifier in the presence of unlabeled data. Our approach involves a "null category noise model" (NCNM) inspired by ordered categorical noise models. The noise model reflects an assumption that the data density is lower between the class-conditional densities. We illustrate our approach on a toy problem and present comparative results for the semi-supervised classification of handwritten digits.
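
The low-density assumption can be illustrated with a sketch of a null-category likelihood. The parameterization below (probit links, a unit-width null region) is an assumption for illustration and may differ from the paper's exact form: an ordered categorical model places a third, never-observed "null" category over the margin, so unlabeled points are more likely far from the decision boundary.

```python
import math

def phi(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def ncnm_likelihood(f, y):
    """Sketch of a null-category noise model.

    f: latent function value; y: -1, +1, or None (unlabeled).
    Three ordered categories: negative / null / positive, with the
    null category covering the margin region around f = 0.
    """
    p_neg = phi(-(f + 0.5))          # category below the null region
    p_pos = phi(f - 0.5)             # category above the null region
    p_null = 1.0 - p_neg - p_pos     # the never-observed margin category
    if y == -1:
        return p_neg
    if y == 1:
        return p_pos
    return 1.0 - p_null              # unlabeled: anything but null

# An unlabeled point is more likely away from the boundary (f = 2)
# than inside the margin (f = 0), which is the low-density assumption.
print(ncnm_likelihood(2.0, None), ncnm_likelihood(0.0, None))
```

Because the null category has nonzero probability mass only near f = 0 and is never observed, maximizing the likelihood of unlabeled data pushes the latent function away from zero at those points, i.e. away from the region between the class-conditional densities.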

