Abstract—Training an artificial neural network (ANN) is an optimization task whose goal is to find the optimal set of weights and biases for the network. There are many traditional methods for training ANNs, such as the Back Propagation (BP) algorithm, Levenberg-Marquardt (LM), Quasi-Newton (QN), and the Genetic Algorithm (GA). Traditional training algorithms may get stuck in local minima, while global search techniques may converge to the global minimum very slowly. Recently, the differential evolution (DE) algorithm has been applied in many practical cases and has demonstrated good convergence properties. The DE algorithm has several control parameters that are kept fixed throughout the entire evolution process; however, these parameters must be tuned, and doing so is not easy. Therefore, in this research we apply an improved self-adaptive strategy for controlling the parameters of the differential evolution algorithm (ISADE-ANN) to train neural networks. Experimental results show that the new ISADE-ANN algorithm achieves higher precision and better performance than traditional training algorithms.
Index Terms—Neural network training, differential evolution, global search, local search, multi-peak problems.
Ngoc Tam Bui is with the Graduate School of Engineering and Science, Shibaura Institute of Technology, Japan (e-mail: firstname.lastname@example.org).
Hiroshi Hasegawa is with the College of Systems Engineering and Science, Shibaura Institute of Technology, Japan (e-mail: hhase@shibaura-it.ac.jp).
Cite: Ngoc Tam Bui and Hiroshi Hasegawa, "Training Artificial Neural Network Using Modification of Differential Evolution Algorithm," International Journal of Machine Learning and Computing vol. 5, no. 1, pp. 1-6, 2015.