Gaussian and Wishart Hyperkernels

Risi Kondor, Tony Jebara

Neural Information Processing Systems 

We propose a new method for constructing hyperkernels and define two promising special cases that can be computed in closed form. These we call the Gaussian and Wishart hyperkernels. The former is especially attractive in that it has an interpretable regularization scheme reminiscent of that of the Gaussian RBF kernel. We discuss how kernel learning can be used not just for improving the performance of classification and regression methods, but also as a stand-alone algorithm for dimensionality reduction and relational or metric learning.
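The abstract leaves the hyperkernel mechanics implicit, so the following is a minimal sketch of the general idea it builds on: a hyperkernel is itself a kernel, but defined on pairs of input pairs, and a data-dependent kernel is obtained as an expansion in it. The product-of-RBFs form below is an illustrative assumption for the sketch, not the paper's closed-form Gaussian or Wishart hyperkernel, and it glosses over the extra symmetry and positivity conditions a true hyperkernel must satisfy.

import numpy as np

def rbf(x, z, sigma=1.0):
    # Standard Gaussian RBF kernel: k(x, z) = exp(-||x - z||^2 / (2 sigma^2)).
    d = np.asarray(x) - np.asarray(z)
    return np.exp(-np.dot(d, d) / (2.0 * sigma ** 2))

def hyper_k(pair1, pair2, sigma=1.0):
    # Toy "hyperkernel" on pairs of pairs: a tensor product of RBF kernels.
    # Assumed form for illustration only; not the paper's construction.
    (x, xp), (z, zp) = pair1, pair2
    return rbf(x, z, sigma) * rbf(xp, zp, sigma)

def learned_kernel(x, xp, train_pairs, alpha, sigma=1.0):
    # Learned kernel as an expansion in the hyperkernel:
    # k(x, x') = sum_i alpha_i * K((x_i, x_i'), (x, x')).
    return sum(a * hyper_k(p, (x, xp), sigma) for a, p in zip(alpha, train_pairs))

# Toy usage: two training pairs with hand-picked (hypothetical) coefficients.
rng = np.random.default_rng(0)
pairs = [(rng.normal(size=3), rng.normal(size=3)) for _ in range(2)]
alpha = [0.7, 0.3]
print(learned_kernel(rng.normal(size=3), rng.normal(size=3), pairs, alpha))

In practice the expansion coefficients alpha would come from an optimization over the hyper-RKHS rather than being fixed by hand; they are hard-coded here only to keep the sketch self-contained and runnable.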
