A HYBRID ALGORITHM FOR FINDING THE GLOBAL MINIMUM OF ERROR FUNCTION OF NEURAL NETWORKS AND ITS APPLICATIONS

Cited by: 48
Authors
BABA, N [1 ]
MOGAMI, Y [1 ]
KOHZAKI, M [1 ]
SHIRAISHI, Y [1 ]
YOSHIDA, Y [1 ]
Affiliation
[1] UNIV TOKUSHIMA, TOKUSHIMA 770, JAPAN
Keywords
NEURAL NETWORKS; GLOBAL MINIMUM; HYBRID ALGORITHM; TOTAL ERROR FUNCTION; BP METHOD; RANDOM OPTIMIZATION METHOD; FASTER CONVERGENCE; CONVERGENCE WITH PROBABILITY-1;
DOI
10.1016/0893-6080(94)90006-X
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Back-propagation has often been used to train artificial neural networks for various pattern classification problems. However, an important limitation of this method is that it sometimes fails to find a global minimum of the total error function of the neural network. In this article, a hybrid algorithm that combines a modified back-propagation method with the random optimization method is proposed to find the global minimum of the total error function of a neural network in a small number of steps. It is shown that this hybrid algorithm ensures convergence to a global minimum with probability 1 in a compact region of the weight vector space. Further, the results of several computer simulations dealing with the problems of forecasting air pollution density, forecasting stock prices, and determining the octane rating in gasoline blending are given.
Pages: 1253-1265
Number of pages: 13
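The record contains no source code. The Python sketch below is only a hypothetical illustration of the kind of hybrid scheme the abstract describes: a few gradient-descent (back-propagation-style) steps alternated with a Matyas-style random optimization step that is accepted only if it lowers the total error. The toy XOR task, network size, learning rate, perturbation scale, and step counts are all illustrative assumptions, not the authors' modified back-propagation method or their specific random optimization procedure.

```python
# Minimal sketch of a hybrid "gradient + random optimization" search.
# Everything below (task, architecture, hyperparameters) is an assumption
# chosen for illustration; it is not the paper's implementation.
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-2-1 network on XOR; all weights live in a single 9-element vector.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

def unpack(w):
    W1 = w[:4].reshape(2, 2); b1 = w[4:6]
    W2 = w[6:8].reshape(2, 1); b2 = w[8:9]
    return W1, b1, W2, b2

def total_error(w):
    # Sum-of-squares "total error function" over the training set.
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)
    y = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    return 0.5 * np.sum((y - T) ** 2)

def grad(w, eps=1e-5):
    # Central-difference gradient keeps the sketch short;
    # real back-propagation would compute this analytically.
    g = np.zeros_like(w)
    for i in range(w.size):
        d = np.zeros_like(w); d[i] = eps
        g[i] = (total_error(w + d) - total_error(w - d)) / (2 * eps)
    return g

def hybrid_minimize(n_iters=200, bp_steps=25, lr=0.5, sigma=0.3):
    w = rng.normal(scale=0.5, size=9)
    best_w, best_e = w.copy(), total_error(w)
    for _ in range(n_iters):
        # Gradient phase: several back-propagation-style descent steps.
        for _ in range(bp_steps):
            w -= lr * grad(w)
        e = total_error(w)
        if e < best_e:
            best_w, best_e = w.copy(), e
        # Random optimization phase: Gaussian trial point around the best
        # weights so far, accepted only if it lowers the total error.
        trial = best_w + rng.normal(scale=sigma, size=w.size)
        trial_e = total_error(trial)
        if trial_e < best_e:
            best_w, best_e = trial, trial_e
        w = best_w.copy()
    return best_w, best_e

if __name__ == "__main__":
    w, e = hybrid_minimize()
    print("final total error:", e)
```

The random phase is what lets the search escape shallow local minima that pure gradient descent would get stuck in; the accept-if-better rule used here is one simple choice for that step.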