General Information
    • ISSN: 2010-3700
    • Frequency: Bimonthly
    • DOI: 10.18178/IJMLC
    • Editor-in-Chief: Dr. Lin Huang
    • Executive Editor:  Ms. Cherry L. Chen
    • Abstracting/Indexing: Engineering & Technology Digital Library, Google Scholar, Crossref, ProQuest, Electronic Journals Library, DOAJ and EI (INSPEC, IET).
    • E-mail: ijmlc@ejournal.net
IJMLC 2016 Vol.6(2): 111-116 ISSN: 2010-3700
DOI: 10.18178/ijmlc.2016.6.2.583

Fast Parallel Randomized Algorithm for Nonnegative Matrix Factorization with KL Divergence for Large Sparse Datasets

Duy-Khuong Nguyen and Tu-Bao Ho
Abstract—Nonnegative Matrix Factorization with Kullback-Leibler divergence (NMF-KL) is one of the most significant NMF problems; it is equivalent to Probabilistic Latent Semantic Indexing (PLSI), which has been applied successfully in many domains. For sparse count data, a Poisson distribution and KL divergence yield sparse models and sparse representations, which describe the random variation better than a normal distribution and the Frobenius norm. Specifically, sparse models give a more concise picture of how attributes appear across latent components, while sparse representations give a concise interpretation of how latent components contribute to individual instances. However, minimizing NMF with KL divergence is much more difficult than minimizing NMF with the Frobenius norm, and obtaining sparse models, sparse representations, and fast algorithms for large sparse datasets remain open challenges for NMF with KL divergence. In this paper, we propose a fast parallel randomized coordinate descent algorithm with fast convergence on large sparse datasets that achieves sparse models and sparse representations. In our experiments, the proposed algorithm outperforms existing methods for this problem.

Index Terms—Nonnegative matrix factorization, Kullback-Leibler divergence, sparse models, sparse representation.
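For readers unfamiliar with the objective being minimized, the following sketch factorizes a nonnegative matrix V ≈ WH under KL divergence using the classical multiplicative updates of Lee and Seung. This is only an illustrative baseline, not the paper's algorithm: the authors propose a faster parallel randomized coordinate descent scheme tailored to large sparse data, whereas the updates below are dense and serial.

```python
import numpy as np

def nmf_kl(V, rank, n_iter=200, eps=1e-10, seed=0):
    """Baseline NMF under KL divergence via multiplicative updates
    (Lee & Seung). Illustrative only; the paper's method is a
    randomized coordinate descent algorithm for sparse data."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(n_iter):
        WH = W @ H + eps
        # W <- W * ((V / WH) H^T) / (1 H^T), elementwise
        W *= (V / WH) @ H.T / (H.sum(axis=1) + eps)
        WH = W @ H + eps
        # H <- H * (W^T (V / WH)) / (W^T 1), elementwise
        H *= W.T @ (V / WH) / (W.sum(axis=0)[:, None] + eps)
    return W, H
```

Each multiplicative step is known to be nonincreasing in the KL divergence D(V ‖ WH), which is why it serves as a standard reference point when evaluating faster coordinate descent variants like the one proposed here.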

D. K. Nguyen is with Japan Advanced Institute of Science and Technology, and University of Engineering and Technology, Vietnam National University, Hanoi, Vietnam (e-mail: khuongnd@gmail.com).
T. B. Ho was with Japan Advanced Institute of Science and Technology, and John von Neumann Institute, Vietnam National University, Ho Chi Minh City, Vietnam (e-mail: bao@jaist.ac.jp).


Cite: Duy-Khuong Nguyen and Tu-Bao Ho, "Fast Parallel Randomized Algorithm for Nonnegative Matrix Factorization with KL Divergence for Large Sparse Datasets," International Journal of Machine Learning and Computing vol.6, no. 2, pp. 111-116, 2016.

Copyright © 2008-2015. International Journal of Machine Learning and Computing. All rights reserved.