Incremental construction of classifier and discriminant ensembles

Cited by: 52
Authors
Ulas, Aydin [1]
Semerci, Murat [2]
Yildiz, Olcay Taner [3]
Alpaydin, Ethem [1]
Affiliations
[1] Bogazici Univ, Dept Comp Engn, TR-34342 Istanbul, Turkey
[2] Rensselaer Polytech Inst, Dept Comp Sci, Troy, NY 12180 USA
[3] Isik Univ, Dept Comp Engn, TR-34980 Istanbul, Turkey
Keywords
Classification; Classifier fusion; Classifier ensembles; Stacking; Machine learning; Voting; Discriminant ensembles; Diversity; LINEAR COMBINATION; NEURAL-NETWORKS; FUSION; SELECT;
DOI
10.1016/j.ins.2008.12.024
Chinese Library Classification (CLC)
TP [Automation technology; computer technology]
Discipline classification code
0812
Abstract
We discuss approaches to incrementally constructing an ensemble. The first constructs an ensemble of classifiers by choosing a subset from a larger set of candidates; the second constructs an ensemble of discriminants, where a classifier is used for only some of the classes. We investigate selection criteria including accuracy, significant improvement, diversity, and correlation, as well as the role of the search direction. For discriminant ensembles, we test subset selection and trees. Fusion is done by voting or by a linear model. Using 14 classifiers on 38 data sets, incremental search finds small, accurate ensembles in polynomial time. The discriminant ensemble uses a subset of the discriminants and is simpler, interpretable, and accurate. We see that an incremental ensemble has higher accuracy than bagging and the random subspace method, and comparable accuracy to AdaBoost while using fewer classifiers. (C) 2009 Elsevier Inc. All rights reserved.
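The sketch below illustrates the core idea of incremental (forward) ensemble construction described in the abstract: greedily add, from a pool of candidate classifiers, the one that most improves a chosen criterion, and stop when no candidate helps. It is a minimal illustration only, assuming scikit-learn base learners, a held-out validation split as the selection criterion, and majority voting for fusion; the dataset, stopping rule, and function names are illustrative and not the authors' exact procedure.

```python
# Minimal sketch: greedy forward selection of classifiers into a voted ensemble.
# Assumptions (not from the paper): scikit-learn learners, validation accuracy
# as the criterion, majority voting as the fusion rule, a toy dataset.
import numpy as np
from sklearn.base import clone
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression

def vote(members, X):
    """Fuse member predictions by simple majority voting."""
    preds = np.stack([m.predict(X) for m in members])   # shape: (n_members, n_samples)
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, preds)

def incremental_ensemble(candidates, X_tr, y_tr, X_val, y_val):
    """Greedy forward search: at each step add the candidate that most improves
    validation accuracy of the voted ensemble; stop when no candidate helps."""
    fitted = [clone(c).fit(X_tr, y_tr) for c in candidates]
    ensemble, best_acc = [], 0.0
    remaining = list(range(len(fitted)))
    while remaining:
        scores = [(np.mean(vote(ensemble + [fitted[i]], X_val) == y_val), i)
                  for i in remaining]
        acc, i = max(scores)
        if acc <= best_acc:          # no improvement: stop growing the ensemble
            break
        ensemble.append(fitted[i])
        remaining.remove(i)
        best_acc = acc
    return ensemble, best_acc

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
candidates = [DecisionTreeClassifier(max_depth=3, random_state=0), GaussianNB(),
              KNeighborsClassifier(5), LogisticRegression(max_iter=5000)]
members, acc = incremental_ensemble(candidates, X_tr, y_tr, X_val, y_val)
print(f"selected {len(members)} of {len(candidates)} classifiers, val acc = {acc:.3f}")
```

Because each step evaluates at most all remaining candidates once, the search examines O(m^2) subsets for m candidates rather than the 2^m possible ensembles, which is the polynomial-time behaviour the abstract refers to.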
Pages: 1298-1318
Number of pages: 21