Abstract—Backpropagation (BP) has been widely used as the de facto standard algorithm for computing the weights of deep neural networks (DNNs). The BP method is based on stochastic gradient descent using the derivatives of an objective function. As an alternative approach, an alternating optimization method using linear and nonlinear semi-nonnegative matrix factorizations (semi-NMFs) has recently been proposed for computing the weight matrices of fully-connected DNNs without bias vectors and regularization. In this paper, we propose an improvement of the nonlinear semi-NMF based method that considers bias vectors and regularization. Experimental results indicate that the proposed method achieves higher recognition performance than the nonlinear semi-NMF based method and is competitive with the conventional BP method.
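To make the semi-NMF building block concrete, the following is a minimal sketch of a linear semi-NMF with a bias vector and Tikhonov (ridge) regularization, solved by alternating least squares with a nonnegativity projection. This is an illustrative assumption, not the authors' exact algorithm: the function name `semi_nmf_layer` and all parameter choices are hypothetical, and the bias is handled by the common device of augmenting the nonnegative factor with a row of ones.

```python
import numpy as np

def semi_nmf_layer(A, k, lam=1e-2, iters=50, seed=0):
    # Hypothetical sketch (not the paper's exact method): approximate
    # A (m x n) as W @ Z + b, with Z >= 0 (semi-NMF constraint),
    # W and b unconstrained, and ridge weight lam regularizing W.
    rng = np.random.default_rng(seed)
    m, n = A.shape
    Z = np.abs(rng.standard_normal((k, n)))    # nonnegative factor
    W = rng.standard_normal((m, k + 1))        # last column plays the bias role
    for _ in range(iters):
        # Absorb the bias by appending a row of ones to Z.
        Za = np.vstack([Z, np.ones((1, n))])
        # Regularized least-squares update of [W | b]:
        W = A @ Za.T @ np.linalg.inv(Za @ Za.T + lam * np.eye(k + 1))
        # Least-squares update of Z, then project onto Z >= 0:
        Wk, b = W[:, :k], W[:, k:]
        Z = np.maximum(np.linalg.pinv(Wk) @ (A - b), 0.0)
    return W[:, :k], W[:, k:], Z
```

The nonlinear variant used for DNN layers additionally applies an elementwise activation to the product; the alternating structure (fix one factor, solve for the other) stays the same.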
Index Terms—Deep neural networks, nonlinear semi-nonnegative matrix factorization, regularization term.
The authors are with the Department of Computer Science, University of Tsukuba, Tsukuba, Japan (e-mail: firstname.lastname@example.org, email@example.com, firstname.lastname@example.org).
Cite: Ryosuke Arai, Akira Imakura, and Tetsuya Sakurai, "An Improvement of the Nonlinear Semi-NMF Based Method by Considering Bias Vectors and Regularization for Deep Neural Networks," International Journal of Machine Learning and Computing vol. 8, no. 3, pp. 191-197, 2018.