Optimizing connection weights in neural networks using the whale optimization algorithm

Cited by: 665
Authors
Aljarah, Ibrahim [1 ]
Faris, Hossam [1 ]
Mirjalili, Seyedali [2 ]
Affiliations
[1] Univ Jordan, King Abdullah II Sch Informat Technol, Amman, Jordan
[2] Griffith Univ, Sch Informat & Commun Technol, Brisbane, Qld 4111, Australia
Keywords
Optimization; Whale optimization algorithm; WOA; Multilayer perceptron; MLP; Training neural network; Evolutionary algorithm; PARTICLE SWARM OPTIMIZATION; DIFFERENTIAL EVOLUTION; GENETIC ALGORITHM; GLOBAL OPTIMIZATION; BACKPROPAGATION; IMPLEMENTATION;
DOI
10.1007/s00500-016-2442-1
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Code
140502 [Artificial Intelligence];
Abstract
The learning process of artificial neural networks is considered one of the most difficult challenges in machine learning and has recently attracted many researchers. The main difficulty in training a neural network lies in its nonlinear nature and in the unknown optimal set of controlling parameters (weights and biases). The main disadvantages of conventional training algorithms are stagnation in local optima and slow convergence, which makes stochastic optimization algorithms a reliable alternative for alleviating these drawbacks. This work proposes a new training algorithm based on the recently proposed whale optimization algorithm (WOA), which has been shown to solve a wide range of optimization problems and outperform existing algorithms. This motivated our attempt to benchmark its performance in training feedforward neural networks. For the first time in the literature, a set of 20 datasets with different levels of difficulty is chosen to test the proposed WOA-based trainer. The results are verified by comparison with the back-propagation algorithm and six evolutionary techniques. The qualitative and quantitative results show that the proposed trainer outperforms the current algorithms on the majority of datasets in terms of both local optima avoidance and convergence speed.
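As a rough illustration of the approach described in the abstract, the following is a minimal sketch (not the authors' implementation) of WOA-based MLP training: each whale's position vector is a flattened copy of the network's weights and biases, and the mean squared error on the training data serves as the fitness function. The single-hidden-layer architecture, the population size, the iteration count, the spiral constant, and the toy XOR dataset are illustrative assumptions, not the paper's settings.

# Minimal sketch of WOA-based MLP training (assumptions: single hidden layer,
# MSE fitness, toy XOR data, 30 whales, 200 iterations, spiral constant b = 1).
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset (XOR) standing in for the benchmark datasets used in the paper.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

n_in, n_hidden, n_out = 2, 5, 1
dim = n_in * n_hidden + n_hidden + n_hidden * n_out + n_out  # all weights + biases

def decode(vec):
    # Split a flat position vector into the MLP's weight matrices and bias vectors.
    i = 0
    W1 = vec[i:i + n_in * n_hidden].reshape(n_in, n_hidden); i += n_in * n_hidden
    b1 = vec[i:i + n_hidden];                                  i += n_hidden
    W2 = vec[i:i + n_hidden * n_out].reshape(n_hidden, n_out); i += n_hidden * n_out
    b2 = vec[i:i + n_out]
    return W1, b1, W2, b2

def fitness(vec):
    # Mean squared error of the MLP encoded by `vec` (lower is better).
    W1, b1, W2, b2 = decode(vec)
    h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))    # sigmoid hidden layer
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output layer
    return np.mean((out - y) ** 2)

n_whales, max_iter, b_spiral = 30, 200, 1.0
pos = rng.uniform(-1, 1, (n_whales, dim))       # initial population of candidate networks
scores = np.array([fitness(p) for p in pos])
best, best_score = pos[scores.argmin()].copy(), scores.min()

for t in range(max_iter):
    a = 2.0 - 2.0 * t / max_iter                # a decreases linearly from 2 to 0
    for i in range(n_whales):
        A = 2 * a * rng.random() - a
        C = 2 * rng.random()
        if rng.random() < 0.5:
            if abs(A) < 1:                      # exploitation: encircle the best whale
                pos[i] = best - A * np.abs(C * best - pos[i])
            else:                               # exploration: move relative to a random whale
                rand = pos[rng.integers(n_whales)]
                pos[i] = rand - A * np.abs(C * rand - pos[i])
        else:                                   # spiral (bubble-net) update around the best
            l = rng.uniform(-1, 1, dim)
            pos[i] = np.abs(best - pos[i]) * np.exp(b_spiral * l) * np.cos(2 * np.pi * l) + best
        s = fitness(pos[i])
        if s < best_score:
            best_score, best = s, pos[i].copy()

print(f"best MSE after {max_iter} iterations: {best_score:.4f}")

In this sketch the best-so-far position plays the role of the "prey"; a real trainer would additionally track validation accuracy and decode `best` back into a usable network once the iteration budget is exhausted.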
Pages: 1-15 (15 pages)