A Kullback-Leibler Divergence Based Kernel for SVM Classification in Multimedia Applications
Moreno, Pedro J., Ho, Purdy P., Vasconcelos, Nuno
Neural Information Processing Systems
Over recent years, significant effort has been made to develop kernels that can be applied to sequence data such as DNA, text, speech, video, and images. The Fisher kernel and similar variants have been suggested as good ways to combine an underlying generative model in the feature space with discriminative classifiers such as SVMs. In this paper we suggest an alternative procedure to the Fisher kernel for systematically finding kernel functions that naturally handle variable-length sequence data in multimedia domains. In particular, for domains such as speech and images, we explore the use of kernel functions that take full advantage of well-known probabilistic models such as Gaussian mixtures and single full-covariance Gaussian models. We derive a kernel distance based on the Kullback-Leibler (KL) divergence between generative models. In effect, our approach combines the best of both generative and discriminative methods and replaces the standard SVM kernels. We perform experiments on speaker identification/verification and image classification tasks and show that these new kernels give the best performance in speaker verification and mostly outperform the Fisher-kernel-based SVMs and the generative classifiers in speaker identification and image classification.
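The single full-covariance Gaussian case mentioned in the abstract can be sketched as follows: each variable-length sequence is summarized by a fitted Gaussian, the symmetric KL divergence between two such Gaussians is computed in closed form, and an exponentiated negative divergence serves as the kernel value. This is a minimal illustration, not the paper's exact formulation; the scaling parameter `a` and the function names are assumptions for the sketch.

```python
import numpy as np

def gaussian_kl(mu_p, cov_p, mu_q, cov_q):
    """Closed-form KL(p || q) between two full-covariance Gaussians."""
    d = mu_p.shape[0]
    cov_q_inv = np.linalg.inv(cov_q)
    diff = mu_q - mu_p
    return 0.5 * (
        np.trace(cov_q_inv @ cov_p)        # trace term
        + diff @ cov_q_inv @ diff          # Mahalanobis term
        - d                                # dimensionality offset
        + np.log(np.linalg.det(cov_q) / np.linalg.det(cov_p))  # log-det ratio
    )

def kl_kernel(mu_p, cov_p, mu_q, cov_q, a=1.0):
    """Kernel value from the exponentiated symmetric KL divergence.

    `a` is a hypothetical scaling hyperparameter; identical models
    give divergence 0 and hence kernel value 1.
    """
    d_sym = (gaussian_kl(mu_p, cov_p, mu_q, cov_q)
             + gaussian_kl(mu_q, cov_q, mu_p, cov_p))
    return np.exp(-a * d_sym)
```

Because the symmetrized divergence is zero only for identical models and the kernel is symmetric in its arguments, the resulting Gram matrix can be supplied to a standard SVM implementation in place of an RBF kernel over fixed-length vectors.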
Dec-31-2004