An improved differential evolution algorithm in training and encoding prior knowledge into feedforward networks with application in chemistry

Cited by: 32
Authors
Chen, CW
Chen, DZ [1]
Cao, GZ
Affiliations
[1] Zhejiang Univ, Dept Chem Engn, Hangzhou 310027, Peoples R China
[2] Zhejiang Univ, Coll Elect Engn, Hangzhou 310027, Peoples R China
Keywords
prior knowledge; feedforward network; improved differential evolution; flip operation; Levenberg-Marquardt descent strategy; random perturbation strategy;
DOI
10.1016/S0169-7439(02)00048-5
Chinese Library Classification
TP [Automation and computer technology];
Discipline Classification Code
0812;
Abstract
Prior-knowledge-based feedforward networks have shown superior performance in modeling chemical processes. In this paper, an improved differential evolution (IDEP) algorithm is proposed to encode prior knowledge into networks during the training process. With regard to monotonic prior knowledge, the IDEP algorithm employs a flip operation to adjust prior-knowledge-violating networks so that they conform to the required monotonicity. In addition, two strategies, a Levenberg-Marquardt descent (LMD) strategy and a random perturbation (RP) strategy, are adopted to speed up the differential evolution (DE) in the algorithm and to prevent it from being trapped in local minima, respectively. To demonstrate the IDEP algorithm's efficiency, we apply it to model two chemical curves under an increasing-monotonicity constraint. For comparison, four network-training algorithms without prior-knowledge constraints, as well as three existing prior-knowledge-based algorithms (which bear some relationship and similarity to the IDEP algorithm), are employed to solve the same problems. The simulation results show that the IDEP algorithm performs better than all of the other algorithms. Finally, the IDEP algorithm and its promising prospects are discussed in detail at the end of the paper. (C) 2002 Elsevier Science B.V. All rights reserved.
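The record does not contain the paper's implementation details, so the following Python sketch is only an illustrative assumption of the ideas named in the abstract: a plain DE/rand/1/bin loop that trains a small one-input feedforward network and applies a hypothetical "flip" repair, here realized as sign-flipping hidden-to-output weights so that each hidden unit's input and output weights share a sign, a standard sufficient condition for a monotonically increasing tanh network. The paper's Levenberg-Marquardt descent and random-perturbation strategies are omitted, and all names (net, flip_repair, train_de) and parameter values are invented for illustration.

import numpy as np

rng = np.random.default_rng(0)

def net(w, x, n_hidden=5):
    # 1-input, 1-output MLP with a tanh hidden layer; w is a flat parameter vector
    w1 = w[:n_hidden]                                # input -> hidden weights
    b1 = w[n_hidden:2 * n_hidden]                    # hidden biases
    w2 = w[2 * n_hidden:3 * n_hidden]                # hidden -> output weights
    b2 = w[3 * n_hidden]                             # output bias
    return np.tanh(np.outer(x, w1) + b1) @ w2 + b2   # shape (len(x),)

def flip_repair(w, x_grid, n_hidden=5):
    # Hypothetical "flip": if the fitted curve decreases anywhere on x_grid,
    # flip the sign of each output weight whose sign disagrees with its input
    # weight, which makes the tanh network monotonically increasing in x.
    w = w.copy()
    if np.any(np.diff(net(w, x_grid, n_hidden)) < 0):
        w1 = w[:n_hidden]
        w2 = w[2 * n_hidden:3 * n_hidden]
        w[2 * n_hidden:3 * n_hidden] = np.where(w1 * w2 < 0, -w2, w2)
    return w

def train_de(x, y, n_hidden=5, pop_size=30, gens=300, F=0.6, CR=0.9):
    # Plain DE minimizing mean squared error on (x, y), with the monotonicity
    # repair applied to every candidate (index i is not excluded from the
    # mutation triple, for brevity).
    dim = 3 * n_hidden + 1
    x_grid = np.linspace(x.min(), x.max(), 50)
    cost = lambda w: np.mean((net(w, x, n_hidden) - y) ** 2)
    pop = np.array([flip_repair(w, x_grid, n_hidden)
                    for w in rng.normal(0.0, 1.0, (pop_size, dim))])
    fit = np.array([cost(w) for w in pop])
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = pop[rng.choice(pop_size, 3, replace=False)]
            trial = np.where(rng.random(dim) < CR, a + F * (b - c), pop[i])
            trial = flip_repair(trial, x_grid, n_hidden)
            f = cost(trial)
            if f < fit[i]:
                pop[i], fit[i] = trial, f
    return pop[np.argmin(fit)]

# Example: fit a noisy increasing curve under the monotonicity constraint
x = np.linspace(0.0, 1.0, 40)
y = 1.0 - np.exp(-3.0 * x) + 0.02 * rng.normal(size=x.size)
w_best = train_de(x, y)

The sign-sharing repair above is one simple way to enforce increasing monotonicity in a single-hidden-layer tanh network; it is not necessarily the flip operation described in the paper, whose details (as well as the LMD and RP strategies) are not given in this record.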
Pages: 27-43
Number of pages: 17