An Improved Genetic Algorithm for BP Neural Network Training

Cited: 4
Authors
Zhou Xiang (周祥)
Chen Bingzhen (陈丙珍)
He Xiaorong (何小荣)
Institution
[1] Department of Chemical Engineering, Tsinghua University
Keywords
BP neural network; local minimum; genetic algorithm; evaluation function; mutation model
DOI
Not available
CLC number
TQ018 [mathematical models and scale-up]
Abstract
The training process of a Back Propagation Neural Network (BPNN) easily converges to a local minimum, which sharply slows training. This paper analyzes the chief cause of local minima and introduces an improved Genetic Algorithm (GA) to overcome them. Most BPNNs use the Sigmoid function as the transfer function of the network nodes; this paper shows that the flat regions of the Sigmoid function give rise to local minima. In the improved GA, targeted modifications are made to the evaluation function and the mutation model. The evaluation of a solution depends on both the value of the error function and the gradient modulus at that solution, so that solutions far from local minima are rated highly. The sensitivity of the error function to the network parameters is incorporated into a self-adapting mutation model, which is effective at diminishing the error function. Both modifications help drive solutions out of local minima. A case study of a real industrial process demonstrates the advantage of the improved GA in overcoming local minima and in accelerating the training process.
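The abstract only outlines the two modifications; the paper's exact formulas are not reproduced here. The following is a minimal, illustrative Python sketch of the idea, assuming a simple additive evaluation (negated error plus a gradient-modulus bonus) and a sensitivity-scaled mutation step. The toy network, data set, and all constants (`alpha`, `eta`, population sizes) are hypothetical choices for illustration, not taken from the paper:

```python
import math
import random

# Illustrative only: a toy 1-2-1 sigmoid network fitted to a few points
# of y = x^2. Network, data, and constants are hypothetical.
DATA = [(x / 4.0, (x / 4.0) ** 2) for x in range(5)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(w, x):
    # w = [w1, b1, w2, b2, v1, v2, c]: two hidden sigmoid nodes, linear output.
    h1 = sigmoid(w[0] * x + w[1])
    h2 = sigmoid(w[2] * x + w[3])
    return w[4] * h1 + w[5] * h2 + w[6]

def error(w):
    # Mean squared error over the training set.
    return sum((predict(w, x) - y) ** 2 for x, y in DATA) / len(DATA)

def grad(w, h=1e-5):
    # Finite-difference gradient of the error w.r.t. each weight.
    g = []
    for i in range(len(w)):
        wp, wm = list(w), list(w)
        wp[i] += h
        wm[i] -= h
        g.append((error(wp) - error(wm)) / (2.0 * h))
    return g

def fitness(w, alpha=0.1):
    # Assumed form of the modified evaluation: low error is rewarded, and a
    # larger gradient modulus raises the score, so flat (near-local-minimum)
    # candidates score lower than equally erroneous but steep ones.
    gmod = math.sqrt(sum(gi * gi for gi in grad(w)))
    return -error(w) + alpha * gmod

def mutate(w, eta=0.5):
    # Assumed self-adapting mutation: each weight is stepped against its
    # sensitivity dE/dw_i with a random scale, so the mutation itself
    # tends to diminish the error function.
    g = grad(w)
    return [wi - eta * random.random() * gi for wi, gi in zip(w, g)]

def train(pop_size=8, gens=20, seed=0):
    # Plain elitist GA loop: keep the top half by fitness, refill by mutation.
    random.seed(seed)
    pop = [[random.uniform(-1.0, 1.0) for _ in range(7)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        pop = survivors + [mutate(w) for w in survivors]
    return min(pop, key=error)
```

The gradient term in `fitness` and the sensitivity-scaled step in `mutate` are one plausible reading of the abstract; the paper's actual evaluation function and mutation model may take a different form.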
Pages: 925-927
Page count: 3