IJMLC 2018 Vol.8(3): 191-197 ISSN: 2010-3700
DOI: 10.18178/ijmlc.2018.8.3.686

An Improvement of the Nonlinear Semi-NMF Based Method by Considering Bias Vectors and Regularization for Deep Neural Networks

Ryosuke Arai, Akira Imakura, and Tetsuya Sakurai

Abstract—Backpropagation (BP) has been widely used as the de-facto standard algorithm for computing the weights of deep neural networks (DNNs). The BP method is a stochastic gradient descent method that uses the derivatives of an objective function. As an alternative approach, an alternating optimization method using linear and nonlinear semi-nonnegative matrix factorizations (semi-NMFs) has recently been proposed for computing the weight matrices of fully-connected DNNs without bias vectors or regularization. In this paper, we propose an improvement of the nonlinear semi-NMF based method that incorporates bias vectors and regularization. Experimental results indicate that the proposed method achieves higher recognition performance than the original nonlinear semi-NMF based method and is competitive with the conventional BP method.
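The alternating scheme the abstract describes can be loosely sketched in NumPy for a two-layer fully-connected network: a linear least-squares step fits the output layer to a nonnegative hidden factor, a nonlinear semi-NMF-style step fits the first layer through a ReLU, and bias vectors enter by appending a row of ones while a Tikhonov (ridge) term provides regularization. This is an illustrative toy under our own assumptions (variable names, sizes, initialization, and the simple alternating updates are ours), not the authors' algorithm from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 2-D inputs, 1-D regression targets (hypothetical sizes).
n, d_in, d_hid, d_out = 200, 2, 16, 1
X = rng.standard_normal((d_in, n))
Y = np.sin(X[:1]) + 0.1 * rng.standard_normal((d_out, n))

lam = 1e-2                     # Tikhonov (ridge) regularization weight
ones = np.ones((1, n))         # appended row of ones models the bias vector


def ridge_solve(A, B, lam):
    """Solve min_W ||W A - B||_F^2 + lam ||W||_F^2 in closed form."""
    G = A @ A.T + lam * np.eye(A.shape[0])
    return np.linalg.solve(G, (B @ A.T).T).T


# Nonnegative hidden factor, as in the semi-NMF constraint.
Z = np.abs(rng.standard_normal((d_hid, n)))

for _ in range(50):
    # Linear step: fit output weights (with bias) to the hidden factor.
    W2 = ridge_solve(np.vstack([Z, ones]), Y, lam)
    # Nonlinear step: fit first-layer weights so ReLU(W1 [X; 1]) ~= Z.
    Xb = np.vstack([X, ones])
    W1 = ridge_solve(Xb, Z, lam)
    # Refresh the nonnegative factor from the current first layer.
    Z = np.maximum(W1 @ Xb, 0.0)

pred = W2 @ np.vstack([Z, ones])
mse = float(np.mean((pred - Y) ** 2))
print(mse)
```

The ridge term keeps both least-squares subproblems well-posed even when the hidden factor is rank-deficient, which is one practical reason the paper adds regularization to the plain semi-NMF scheme.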

Index Terms—Deep neural networks, nonlinear semi-nonnegative matrix factorization, regularization term.

The authors are with the Department of Computer Science, University of Tsukuba, Tsukuba, Japan (e-mail: arai@mma.cs.tsukuba.ac.jp, imakura@cs.tsukuba.ac.jp, sakurai@cs.tsukuba.ac.jp).


Cite: Ryosuke Arai, Akira Imakura, and Tetsuya Sakurai, "An Improvement of the Nonlinear Semi-NMF Based Method by Considering Bias Vectors and Regularization for Deep Neural Networks," International Journal of Machine Learning and Computing vol. 8, no. 3, pp. 191-197, 2018.

General Information

  • ISSN: 2010-3700 (Online)
  • Abbreviated Title: Int. J. Mach. Learn. Comput.
  • Frequency: Bimonthly
  • DOI: 10.18178/IJMLC
  • Editor-in-Chief: Dr. Lin Huang
  • Executive Editor: Ms. Cherry L. Chen
  • Abstracting/Indexing: Scopus (since 2017), EI (INSPEC, IET), Google Scholar, Crossref, ProQuest, Electronic Journals Library.
  • E-mail: ijmlc@ejournal.net