Fast neural network ensemble learning via negative-correlation data correction

Cited by: 14
Authors:
Chan, ZSH [1]
Kasabov, N [1]
Affiliation:
[1] Auckland Univ Technol, KEDRI, Auckland, New Zealand
Source:
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2005, Vol. 16, No. 6
Keywords:
distributed computing; ensemble learning; negative correlation (NC) learning
DOI:
10.1109/TNN.2005.852859
CLC classification:
TP18 [Artificial Intelligence Theory]
Discipline codes:
081104; 0812; 0835; 1405
Abstract:
This letter proposes a new negative correlation (NC) learning method that is easy to implement and has two advantages: 1) it requires much less communication overhead than the standard NC method, and 2) it is applicable to ensembles of heterogeneous networks.
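For context on what the letter improves upon, the following is a minimal sketch of standard NC learning (Liu & Yao, 1999): each network in the ensemble is trained on its squared error plus a penalty that decorrelates its output from the ensemble mean, which yields the well-known per-network gradient (f_i - y) - lambda * (f_i - f_bar). The function name and the choice of lambda are illustrative, not taken from the paper.

```python
import numpy as np

def nc_gradients(outputs, target, lam=0.5):
    """Per-network output gradients under standard NC learning.

    outputs: array of shape (M,), the M networks' outputs for one sample
    target:  scalar target y
    lam:     penalty strength lambda in [0, 1]
    """
    f_bar = outputs.mean()  # ensemble (simple-average) output
    # Each network i minimizes E_i = 0.5*(f_i - y)^2 + lam * p_i, with
    # penalty p_i = (f_i - f_bar) * sum_{j != i} (f_j - f_bar),
    # giving the gradient below; note it needs f_bar, i.e. every other
    # network's output, at every step -- the communication overhead the
    # letter's data-correction scheme avoids.
    return (outputs - target) - lam * (outputs - f_bar)

# Example: three networks, one sample with target y = 1.0
outs = np.array([0.9, 1.1, 1.3])
grads = nc_gradients(outs, target=1.0, lam=0.5)
print(grads)
```

With lam=0, this reduces to independent training of each network; larger lam pushes the networks' errors to be negatively correlated so that they cancel in the ensemble average.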
Pages: 1707-1710
Page count: 4