Generalized KPCA by adaptive rules in feature space

Yanwei Pang, Lei Wang, Yuan Yuan*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Principal component analysis (PCA) is a well-established technique for dimensionality reduction, and kernel PCA (KPCA) has been proposed as its nonlinear extension for statistical data analysis. However, KPCA fails to capture the nonlinear structure of data well when outliers are present. To alleviate this problem, this paper presents a novel algorithm, named iterative robust KPCA (IRKPCA). IRKPCA deals well with outliers and can be carried out iteratively, which makes it suitable for processing incremental input data. As in traditional robust PCA (RPCA), a binary field is employed to characterize the outlier process, and the optimization problem is formulated as maximizing the marginal distribution of a Gibbs distribution. In this paper, this optimization problem is solved by stochastic gradient descent. Because the outlier process in IRKPCA resides in a high-dimensional feature space, the kernel trick is used. IRKPCA can be regarded as a kernelized version of RPCA and as a robust form of the kernel Hebbian algorithm. Experimental results on synthetic data demonstrate the effectiveness of IRKPCA.
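The abstract names the ingredients (a kernel Hebbian update, a binary outlier field, stochastic gradient steps) but not the update equations. The NumPy sketch below is a minimal illustration under stated assumptions, not the paper's method: it combines a Sanger-style kernel Hebbian rule with a hard reconstruction-error gate as a stand-in for the Gibbs-marginal outlier model. The function name irkpca_sketch, the threshold tau, and the RBF kernel choice are all assumptions for illustration.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF (Gaussian) kernel between the rows of X and Y.
    d2 = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * d2)

def irkpca_sketch(X, n_components=2, gamma=1.0, lr=0.05,
                  n_epochs=100, tau=0.5, seed=0):
    """Toy robust kernel Hebbian iteration (kernel centering omitted).

    Each principal direction in feature space is represented by a row of
    expansion coefficients A over the training samples. `tau` is an
    assumed gating threshold, not a quantity from the paper.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)                         # Gram matrix, (n, n)
    A = 0.01 * rng.standard_normal((n_components, n))   # expansion coefficients
    for _ in range(n_epochs):
        for t in rng.permutation(n):                    # stochastic sweep
            k_t = K[:, t]                               # kernel vector of sample t
            y = A @ k_t                                 # projections of phi(x_t)
            # Exact feature-space reconstruction error of phi(x_t):
            # ||phi(x_t) - sum_i y_i w_i||^2, with w_i = sum_j A[i, j] phi(x_j).
            recon_err = K[t, t] - 2.0 * y @ y + y @ (A @ K @ A.T) @ y
            # Binary outlier field: gate out high-error samples (a hedged
            # substitute for the Gibbs-marginal formulation in the paper).
            if recon_err > tau:
                continue
            # Generalized Hebbian (Sanger) rule on the coefficients.
            e_t = np.zeros(n)
            e_t[t] = 1.0
            A += lr * (np.outer(y, e_t) - np.tril(np.outer(y, y)) @ A)
    return A, K
```

Under these assumptions, projections of the training set are Z = A @ K, and a new batch X_new projects as A @ rbf_kernel(X, X_new, gamma); because the coefficient matrix is updated one sample at a time, the same loop can ingest incremental data, which is the property the abstract highlights.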

Original language: English
Pages (from-to): 956-968
Number of pages: 13
Journal: International Journal of Computer Mathematics
Volume: 87
Issue number: 5
Early online date: 10 May 2010
Publication status: Published - 2010

Keywords

  • dimension reduction
  • feature extraction
  • iterative robust KPCA
  • KPCA
  • outliers
  • PCA
