A Generalization of Principal Component Analysis
Samuele Battaglino and Erdem Koyuncu

Abstract -- Conventional principal component analysis (PCA) finds a principal vector that maximizes the sum of the squares of the principal components. We consider a generalized PCA that instead maximizes the sum of an arbitrary convex function of the principal components. We present a gradient ascent algorithm to solve this problem. For the kernel version of generalized PCA, we show that the solutions can be obtained as fixed points of a simple single-layer recurrent neural network. We also evaluate our algorithms on different datasets.

I. INTRODUCTION

A. Conventional Principal Component Analysis (PCA)

PCA and variant methods are dimension-reduction techniques that rely on orthogonal transformations [1]-[3].
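As a rough illustration of the idea in the abstract, the following sketch runs projected gradient ascent to maximize the sum of a convex function f of the principal components, i.e. max over unit vectors w of sum_i f(x_i^T w). The function names, learning rate, and iteration count are illustrative assumptions, not the paper's actual algorithm; choosing f(t) = t^2 recovers conventional PCA.

```python
import numpy as np

def generalized_pca(X, f_prime, lr=0.1, n_iter=500, seed=0):
    """Projected gradient ascent for max_{||w||=1} sum_i f(x_i^T w).

    X       : (n, d) data matrix, assumed centered.
    f_prime : derivative of the convex function f, applied elementwise
              to the principal components.  f(t) = t^2 (f_prime(t) = 2t)
              reduces this to conventional PCA (a power-iteration-like
              scheme converging to the top eigenvector of X^T X).
    """
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(X.shape[1])
    w /= np.linalg.norm(w)                 # start on the unit sphere
    for _ in range(n_iter):
        z = X @ w                          # principal components x_i^T w
        grad = X.T @ f_prime(z)            # gradient of sum_i f(x_i^T w)
        w = w + lr * grad                  # ascent step
        w /= np.linalg.norm(w)             # project back to the sphere
    return w
```

With f(t) = t^2 this iteration is equivalent to power iteration on I + 2*lr*X^T X, so on data whose variance is concentrated along one axis the returned w aligns with that axis.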
Nov-15-2019