Fast Kernels for String and Tree Matching
Smola, Alex J., Vishwanathan, S.v.n.
In this paper we present a new algorithm suitable for matching discrete objects such as strings and trees in linear time, thus obviating dynamic programming with quadratic time complexity. Furthermore, prediction cost in many cases can be reduced to linear cost in the length of the sequence to be classified, regardless of the number of support vectors. This improvement on the currently available algorithms makes string kernels a viable alternative for the practitioner.
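The paper's linear-time matching rests on suffix trees; as a simpler illustration of evaluating a substring-based kernel in time linear in the input lengths, the sketch below implements the k-spectrum kernel (a special case of substring matching) via hashed k-gram counts. The function name and parameters are illustrative, not from the paper.

```python
from collections import Counter

def spectrum_kernel(s, t, k=3):
    """k-spectrum string kernel: the inner product of the two
    strings' k-gram count vectors. Hashing all k-grams takes time
    linear in len(s) + len(t) for fixed k -- avoiding the quadratic
    dynamic-programming comparison of all substring pairs."""
    cs = Counter(s[i:i + k] for i in range(len(s) - k + 1))
    ct = Counter(t[i:i + k] for i in range(len(t) - k + 1))
    # sum counts over k-grams common to both strings
    return sum(cs[g] * ct[g] for g in cs if g in ct)

# usage: "abc" occurs twice in the first string and once in the second
print(spectrum_kernel("abcabc", "abcd", k=3))  # → 2
```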
Kernel Machines and Boolean Functions
Kowalczyk, Adam, Smola, Alex J., Williamson, Robert C.
We give results about the learnability and required complexity of logical formulae to solve classification problems. These results are obtained by linking propositional logic with kernel machines. In particular we show that decision trees and disjunctive normal forms (DNF) can be represented with the help of a special kernel, linking regularized risk to separation margin. Subsequently we derive a number of lower bounds on the required complexity of logic formulae using properties of algorithms for generation of linear estimators, such as perceptron and maximal perceptron learning.
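One standard kernel linking Boolean functions to kernel machines counts the conjunctions satisfied by both inputs; the sketch below shows this common "DNF kernel" form, though the exact kernel and weighting used in the paper may differ.

```python
def dnf_kernel(x, z):
    """Counts the nonempty conjunctions of literals satisfied by
    both Boolean vectors x and z. Each coordinate where x and z
    agree may or may not appear in a conjunction, so the count is
    2^(#agreements) - 1 (excluding the empty conjunction). This is
    one common form of "DNF kernel", given here for illustration."""
    agree = sum(1 for a, b in zip(x, z) if a == b)
    return 2 ** agree - 1

# usage: the vectors agree in coordinates 0 and 2, giving 2^2 - 1
print(dnf_kernel([1, 0, 1], [1, 1, 1]))  # → 3
```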
Online Learning with Kernels
Kivinen, Jyrki, Smola, Alex J., Williamson, Robert C.
We consider online learning in a Reproducing Kernel Hilbert Space. Our method is computationally efficient and leads to simple algorithms. In particular we derive update equations for classification, regression, and novelty detection. The inclusion of the ν-trick allows us to give a robust parameterization.
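The update equations the abstract mentions amount to stochastic gradient descent in the RKHS: the hypothesis is a kernel expansion whose coefficients shrink under regularization, with one new coefficient added per example. The sketch below shows the regression case with squared loss; the class name and hyperparameters (eta, lam, gamma) are illustrative, and the paper's ν-trick parameterization is not reproduced here.

```python
import math

def rbf(x, y, gamma=1.0):
    # Gaussian RBF kernel on scalars
    return math.exp(-gamma * (x - y) ** 2)

class OnlineKernelRegressor:
    """Sketch of online learning in an RKHS via stochastic gradient
    descent on regularized squared loss (illustrative names)."""

    def __init__(self, eta=0.2, lam=0.01, gamma=1.0):
        self.eta, self.lam, self.gamma = eta, lam, gamma
        self.points, self.alphas = [], []

    def predict(self, x):
        # f(x) = sum_i alpha_i k(x_i, x)
        return sum(a * rbf(p, x, self.gamma)
                   for p, a in zip(self.points, self.alphas))

    def update(self, x, y):
        # squared loss l(f, y) = (f - y)^2 / 2, so dl/df = f - y
        err = self.predict(x) - y
        # shrink old coefficients (gradient of the regularizer)
        self.alphas = [(1 - self.eta * self.lam) * a for a in self.alphas]
        # add the new point with coefficient -eta * dl/df
        self.points.append(x)
        self.alphas.append(-self.eta * err)

# usage: learn y = sin(x) from a stream of examples
model = OnlineKernelRegressor(eta=0.2, lam=0.0, gamma=2.0)
for t in range(200):
    x = (t % 20) / 20.0 * 3.0
    model.update(x, math.sin(x))
```

Each update costs one prediction over the current expansion, so without truncation the expansion grows with the stream; the paper's setting additionally controls this for efficiency.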
Regularization with Dot-Product Kernels
Smola, Alex J., Óvári, Zoltán L., Williamson, Robert C.
Sparse Greedy Gaussian Process Regression
Smola, Alex J., Bartlett, Peter L.