One-Class Kernel Spectral Regression for Outlier Detection

Shervin Rahimzadeh Arashloo, Josef Kittler

arXiv.org Machine Learning 

The paper introduces a new efficient nonlinear one-class classifier formulated as a Rayleigh quotient criterion. The method, operating in a reproducing kernel Hilbert space, minimises the scatter of the target distribution along an optimal projection direction while simultaneously keeping the projections of positive observations as distant as possible from the mean of the negative class. We provide a graph-embedding view of the problem, which can then be solved efficiently using the spectral regression approach. Unlike previous similar methods, which often require costly eigen-computations of dense matrices, the proposed approach casts the problem into a regression framework that avoids eigen-decomposition altogether. In particular, it is shown that the dominant cost of the proposed method is that of computing the kernel matrix. Additional appealing characteristics of the proposed one-class classifier are: (1) the ability to be trained incrementally, allowing for application in streaming-data scenarios while also reducing computational complexity in a non-streaming operation mode; (2) being unsupervised, while also providing the functionality to refine the solution using negative training examples when available; and (3) the use of the kernel trick, allowing the data to be mapped nonlinearly into a high-dimensional feature space. Extensive experiments conducted on several datasets verify the merits of the proposed approach in comparison with other alternatives.
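The computational claim at the heart of the abstract, replacing the eigen-decomposition of a graph-embedding problem with a regularised regression solve over the kernel matrix, can be illustrated with a minimal sketch. This is not the authors' implementation: the function names (`rbf_kernel`, `fit_one_class`, `score`), the choice of a constant target embedding for the positive class, and the regulariser `delta` are all illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise RBF (Gaussian) kernel between the rows of A and B.
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def fit_one_class(X, gamma=1.0, delta=1e-3):
    # Spectral-regression-style training: rather than eigen-decomposing a
    # dense scatter matrix, solve the regularised linear system
    #   (K + delta * I) alpha = y,
    # where y is a target embedding vector. Here y is constant (an assumed
    # simplification), so positive samples project near a common value.
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    y = np.ones(n)
    alpha = np.linalg.solve(K + delta * np.eye(n), y)
    mu = (K @ alpha).mean()  # mean projection of the target class
    return alpha, mu

def score(Xtr, alpha, mu, Xte, gamma=1.0):
    # Outlier score: distance of a test point's projection from the
    # target-class mean projection; larger means more anomalous.
    k = rbf_kernel(Xte, Xtr, gamma)
    return np.abs(k @ alpha - mu)
```

As the abstract notes, the dominant cost here is forming the kernel matrix; the linear solve replaces any eigen-computation, and an incremental variant would update `alpha` as new target samples arrive.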
