IJMLC 2012 Vol.2(5): 701-705 ISSN: 2010-3700
DOI: 10.7763/IJMLC.2012.V2.218

Some Extension of Sparse Principal Component Analysis

Thanh D. X. Duong and Hung V. Nguyen

Abstract—Given a covariance matrix, sparse principal component analysis (SPCA) considers the problem of maximizing the variance explained by a particular linear combination of the input variables in which the number of nonzero coefficients is constrained. In some applications, the coefficients of this combination are additionally required to be non-negative. Moreover, when loading an input variable incurs an individual cost, weights representing these loading costs must be incorporated into the sparsity constraint. In this paper, we consider SPCA problems with a weighted sparsity constraint and/or a non-negativity constraint. These problems are reduced to semidefinite programming problems via a convex relaxation technique. Numerical results show that the method is efficient and reliable in practice.
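To make the reduction concrete, the following is a minimal numerical sketch in Python with CVXPY of a standard d'Aspremont-style semidefinite relaxation of SPCA: the rank-one matrix xx^T is relaxed to a positive semidefinite matrix X with unit trace, the cardinality constraint is surrogated by an l1-type bound on the entries of X, loading costs enter that bound through an assumed weighting w_i * w_j * |X_ij|, and non-negativity is handled by requiring X to be elementwise non-negative. The function name spca_sdp, the weighting scheme, and this treatment of the non-negativity constraint are illustrative assumptions and not necessarily the authors' exact formulation.

import numpy as np
import cvxpy as cp

def spca_sdp(Sigma, k, nonneg=False, weights=None):
    """Solve a semidefinite relaxation of sparse PCA (illustrative sketch).

    Sigma   : (n, n) covariance matrix
    k       : sparsity level (bound used in the l1-type surrogate)
    nonneg  : if True, also require X to be elementwise non-negative
              (an assumed handling of the non-negativity constraint)
    weights : optional non-negative loading costs w_i; they enter the
              surrogate as w_i * w_j * |X_ij| (an assumption)
    """
    n = Sigma.shape[0]
    X = cp.Variable((n, n), symmetric=True)
    # Relax the rank-one constraint: X is PSD with unit trace.
    constraints = [X >> 0, cp.trace(X) == 1]
    if weights is None:
        # Unweighted l1 surrogate of the cardinality constraint.
        constraints.append(cp.sum(cp.abs(X)) <= k)
    else:
        W = np.outer(weights, weights)  # weights must be non-negative for convexity
        constraints.append(cp.sum(cp.multiply(W, cp.abs(X))) <= k)
    if nonneg:
        constraints.append(X >= 0)
    prob = cp.Problem(cp.Maximize(cp.trace(Sigma @ X)), constraints)
    prob.solve()
    # The leading eigenvector of the optimal X gives an approximate sparse loading.
    eigvals, eigvecs = np.linalg.eigh(X.value)
    v = eigvecs[:, -1]
    if v.sum() < 0:  # resolve the sign ambiguity of the eigenvector
        v = -v
    return v

# Toy usage on a random covariance matrix
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
Sigma = np.cov(A, rowvar=False)
loading = spca_sdp(Sigma, k=2, nonneg=True)
print(np.round(loading, 3))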

Index Terms—Iterative re-weighting, non-negative constraint principal component analysis, principal component analysis, semi-definite relaxation, sparse principal component analysis.

Thanh D. X. Duong is with the John von Neumann Institute, VNU-HCM, Vietnam (e-mail: thanh.duong@jvn.edu.vn).
Hung V. Nguyen is with the Japan Advanced Institute of Science and Technology, Japan (e-mail: nvhung@jaist.ac.jp).


Cite: Thanh D. X. Duong and Hung V. Nguyen, "Some Extension of Sparse Principal Component Analysis," International Journal of Machine Learning and Computing, vol. 2, no. 5, pp. 701-705, 2012.

General Information

  • E-ISSN: 2972-368X
  • Abbreviated Title: Int. J. Mach. Learn.
  • Frequency: Quarterly
  • DOI: 10.18178/IJML
  • Editor-in-Chief: Dr. Lin Huang
  • Executive Editor:  Ms. Cherry L. Chen
  • Abstracting/Indexing: Inspec (IET), Google Scholar, Crossref, ProQuest, Electronic Journals Library, CNKI.
  • E-mail: ijml@ejournal.net

