A Generalization of Principal Component Analysis

Samuele Battaglino, Erdem Koyuncu

arXiv.org Machine Learning 

Abstract -- Conventional principal component analysis (PCA) finds a principal vector that maximizes the sum of the squares of the principal components. We consider a generalized PCA that aims at maximizing the sum of an arbitrary convex function of the principal components. We present a gradient ascent algorithm to solve the problem. For the kernel version of generalized PCA, we show that the solutions can be obtained as fixed points of a simple single-layer recurrent neural network. We also evaluate our algorithms on different datasets.

I. INTRODUCTION

A. Conventional Principal Component Analysis (PCA)

PCA and variant methods are dimension reduction techniques that rely on orthogonal transformations [1]-[3].
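The generalized objective described in the abstract, maximizing a sum of a convex function f of the principal components over unit-norm vectors, can be sketched with projected gradient ascent. The sketch below is illustrative only and is not the authors' exact algorithm; the function name `generalized_pca` and all hyperparameters (step size, iteration count) are assumptions, and f(t) = t^2 is used as a sanity check since it recovers conventional PCA.

```python
import numpy as np

def generalized_pca(X, f_prime, steps=500, lr=0.01, seed=0):
    """Projected gradient ascent for max_w sum_i f(w @ x_i) s.t. ||w|| = 1.

    X       : (n_samples, n_features) data matrix (assumed centered)
    f_prime : derivative of the convex function f applied to each component
    """
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(X.shape[1])
    w /= np.linalg.norm(w)               # start on the unit sphere
    for _ in range(steps):
        s = X @ w                        # principal components w^T x_i
        grad = X.T @ f_prime(s)          # gradient of sum_i f(w^T x_i)
        w = w + lr * grad                # ascent step
        w /= np.linalg.norm(w)           # project back to the unit sphere
    return w

# Sanity check: f(t) = t^2 (so f'(t) = 2t) should recover conventional PCA,
# i.e. the direction of largest variance (axis 0 in this synthetic data).
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5)) @ np.diag([3.0, 1.0, 1.0, 1.0, 1.0])
w = generalized_pca(X, lambda t: 2 * t)
```

Because the iterate is renormalized after every step, the f(t) = t^2 case behaves like power iteration on the sample covariance; other convex choices of f (e.g. f(t) = t^4) change which directions the objective rewards.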
