Combining classifiers with meta decision trees

Cited by: 133
Authors
Todorovski, L [1]
Dzeroski, S [1]
Affiliation
[1] Jozef Stefan Inst, Dept Intelligent Syst, Ljubljana, Slovenia
Keywords
ensembles of classifiers; meta-level learning; combining classifiers; stacking; decision trees
DOI
10.1023/A:1021709817809
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
The paper introduces meta decision trees (MDTs), a novel method for combining multiple classifiers. Instead of giving a prediction, the leaves of an MDT specify which classifier should be used to obtain a prediction. We present an algorithm for learning MDTs based on the C4.5 algorithm for learning ordinary decision trees (ODTs). An extensive experimental evaluation of the new algorithm is performed on twenty-one data sets, combining classifiers generated by five learning algorithms: two algorithms for learning decision trees, a rule learning algorithm, a nearest neighbor algorithm, and a naive Bayes algorithm. In terms of performance, stacking with MDTs combines classifiers better than voting and stacking with ODTs. In addition, the MDTs are much more concise than the ODTs and are thus a step towards comprehensible combination of multiple classifiers. MDTs also perform better than several other approaches to stacking.
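The core idea in the abstract can be illustrated with a small sketch. This is not the paper's exact algorithm (which extends C4.5 and uses richer class-distribution properties as meta-level attributes); it is a simplified, hypothetical reconstruction using scikit-learn, in which a decision tree at the meta level learns to pick which base classifier to trust for each example, rather than predicting the class itself. The choice of meta-features (each base classifier's maximum predicted probability) and the fallback to classifier 0 are assumptions for illustration only.

```python
# Simplified MDT-style combining (illustrative sketch, not the paper's algorithm):
# a meta-level decision tree selects WHICH base classifier to use per example,
# instead of predicting the class directly.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split, cross_val_predict
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

base = [DecisionTreeClassifier(random_state=0), GaussianNB(), KNeighborsClassifier()]

# Unbiased meta-level attributes via cross-validated predictions on the
# training set: here, each base classifier's confidence (max probability).
proba_tr = [cross_val_predict(c, X_tr, y_tr, cv=5, method="predict_proba")
            for c in base]
meta_tr = np.column_stack([p.max(axis=1) for p in proba_tr])
preds_tr = np.column_stack([p.argmax(axis=1) for p in proba_tr])

# Meta-class: index of a base classifier that got the example right
# (arbitrary fallback to classifier 0 when none did).
correct = preds_tr == y_tr[:, None]
meta_y = np.where(correct.any(axis=1), correct.argmax(axis=1), 0)

# The meta decision tree: its leaves name a classifier, not a class.
mdt = DecisionTreeClassifier(max_depth=3, random_state=0).fit(meta_tr, meta_y)

# Refit base classifiers on all training data for test-time use.
for c in base:
    c.fit(X_tr, y_tr)

# At prediction time, the MDT leaf selects which classifier's vote to take.
meta_te = np.column_stack([c.predict_proba(X_te).max(axis=1) for c in base])
chosen = mdt.predict(meta_te)
preds_te = np.column_stack([c.predict(X_te) for c in base])
y_hat = preds_te[np.arange(len(X_te)), chosen]
print("accuracy:", (y_hat == y_te).mean())
```

Because the combiner only has to route each example to one of a handful of classifiers, the resulting meta tree can stay very small, which is the comprehensibility advantage the abstract claims over ordinary stacked decision trees.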
Pages: 223-249
Number of pages: 27