General Information
    • ISSN: 2010-3700
    • Abbreviated Title: Int. J. Mach. Learn. Comput.
    • DOI: 10.18178/IJMLC
    • Editor-in-Chief: Dr. Lin Huang
    • Executive Editor: Ms. Cherry L. Chen
    • Abstracting/Indexing: Scopus (since 2017), EI (INSPEC, IET), Google Scholar, Crossref, ProQuest, Electronic Journals Library.
    • E-mail: ijmlc@ejournal.net
Editor-in-chief
Dr. Lin Huang
Metropolitan State University of Denver, USA
It is my honor to take on the position of Editor-in-Chief of IJMLC. We encourage authors to submit papers concerning any branch of machine learning and computing.
IJMLC 2019 Vol.9(3): 267-272 ISSN: 2010-3700
DOI: 10.18178/ijmlc.2019.9.3.797

Forget the Learning Rate, Decay Loss

Jiakai Wei
Abstract—In typical deep neural network optimization, the learning rate is the most important hyperparameter and strongly affects final convergence. Its purpose is to control the step size and gradually reduce the impact of noise on the network. In this paper, we use a fixed learning rate together with a loss-decay method to control the magnitude of each update. We verify this method on image classification, semantic segmentation, and GANs. Experiments show that the loss-decay strategy can greatly improve model performance.
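The intuition behind the abstract can be illustrated with a minimal sketch: for plain SGD, multiplying the loss by a constant scales the gradient, and hence the update, by the same constant, so decaying a multiplier on the loss while keeping the learning rate fixed shrinks the step size exactly as a decayed learning rate would. The sketch below assumes an exponential decay schedule; the toy model, data, and decay_rate are placeholders for illustration, not the paper's actual setup.

    # Loss decay with a fixed learning rate: a minimal, assumed sketch (PyTorch).
    import torch
    import torch.nn as nn

    model = nn.Linear(10, 2)                                 # toy model (placeholder)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)  # fixed learning rate
    criterion = nn.CrossEntropyLoss()
    decay_rate = 0.99                                        # assumed schedule, not from the paper

    for epoch in range(100):
        inputs = torch.randn(32, 10)                         # placeholder batch
        targets = torch.randint(0, 2, (32,))
        optimizer.zero_grad()
        loss = criterion(model(inputs), targets)
        # Scale the loss instead of the learning rate: for plain SGD this
        # shrinks the update magnitude exactly as a decayed lr would.
        scaled_loss = loss * (decay_rate ** epoch)
        scaled_loss.backward()
        optimizer.step()

Note that this equivalence is exact only for vanilla SGD; adaptive optimizers such as Adam normalize gradient magnitudes, so loss scaling and learning-rate scaling are not interchangeable there.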

Index Terms—Deep learning, optimization.

Jiakai Wei is with the Hunan University of Technology, China (e-mail: 16408400236@stu.hut.edu.cn).


Cite: Jiakai Wei, "Forget the Learning Rate, Decay Loss," International Journal of Machine Learning and Computing, vol. 9, no. 3, pp. 267-272, 2019.
