A Dempster-Shafer theoretic framework for boosting-based ensemble design

Cited by: 8
Authors
Altinçay, H [1 ]
Affiliation
[1] Eastern Mediterranean Univ, Dept Comp Engn, KKTC, TR-10 Gazimagusa, Mersin, Turkey
Keywords
evidential pattern classification; classifier ensembles; dynamic classifier combination; sample neighborhood information; boosting;
DOI
10.1007/s10044-005-0010-x
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Training-set resampling is widely used in ensemble design to reduce the classification errors of the base classifiers. Boosting is one such technique: each training set is obtained by drawing samples with replacement from the available training set according to a weighted distribution, which is modified for each new classifier to be included in the ensemble. The weighted resampling produces a set of classifiers, each accurate in a different part of the input space, mainly specified by the sample weights. In this study, a dynamic integration of boosting-based ensembles is proposed so as to take into account the heterogeneity of the input sets. For this purpose, an evidence-theoretic framework is developed that uses the weights and distances of the neighboring training samples in both training and testing of boosting-based ensembles. The effectiveness of the proposed technique is compared with the AdaBoost algorithm using three different base classifiers.
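The weighted-distribution update the abstract describes is the core of AdaBoost. The following is a minimal illustrative sketch using decision stumps as the base classifier; it uses the deterministic reweighting form of AdaBoost (the resampling variant instead draws a bootstrap sample from the same weight distribution), and it does not implement the paper's evidence-theoretic combination.

```python
import math

def train_stump(X, y, w):
    """Pick the (feature, threshold, polarity) stump with the lowest
    weighted error on (X, y) under sample weights w."""
    best, best_err = None, float("inf")
    for f in range(len(X[0])):
        for t in sorted({x[f] for x in X}):
            for pol in (1, -1):
                preds = [pol if x[f] >= t else -pol for x in X]
                err = sum(wi for wi, p, yi in zip(w, preds, y) if p != yi)
                if err < best_err:
                    best_err, best = err, (f, t, pol)
    return best, best_err

def stump_predict(stump, x):
    f, t, pol = stump
    return pol if x[f] >= t else -pol

def adaboost(X, y, rounds=10):
    """AdaBoost loop: maintain a weight distribution over the training
    samples, fit a weak learner to the weighted data, then increase the
    weights of misclassified samples before the next round."""
    n = len(X)
    w = [1.0 / n] * n          # start from the uniform distribution
    ensemble = []
    for _ in range(rounds):
        stump, err = train_stump(X, y, w)
        err = max(err, 1e-10)  # guard the log below against err == 0
        if err >= 0.5:         # weak learner no better than chance: stop
            break
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, stump))
        # Reweight: misclassified samples get heavier, then renormalize.
        w = [wi * math.exp(-alpha * yi * stump_predict(stump, xi))
             for wi, xi, yi in zip(w, X, y)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    """Weighted vote of the boosted stumps."""
    score = sum(a * stump_predict(s, x) for a, s in ensemble)
    return 1 if score >= 0 else -1
```

For example, a single stump cannot label the 1-D interval concept `y = +1 iff x in [2, 3]`, but three boosting rounds produce an ensemble that classifies all six training points correctly.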
Pages: 287-302
Page count: 16
Cited References
45 entries in total
[1] Al-Ani, M.; Deriche, M. A new technique for combining multiple classifiers using the Dempster-Shafer theory of evidence [J]. Journal of Artificial Intelligence Research, 2002, 17: 333-361.
[2] [Anonymous], 1999, P 16 INT JOINT C ART.
[3] [Anonymous], 2001, EFFECT CLASS DISTRIB.
[4] Bhattacharya, P. On the Dempster-Shafer evidence theory and non-hierarchical aggregation of belief structures [J]. IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, 2000, 30(5): 526-536.
[5] Blake, C.L., 1998, UCI repository of machine learning databases.
[6] Blaylock, N., 2004, P 14 INT C AUT PLANN.
[7] Bloch, I., 1997, P 10 BRAZ S COMP GRA.
[8] Breiman, L. Bagging predictors [J]. Machine Learning, 1996, 24(2): 123-140.
[9] Cattaneo, M.E.G., 2003, P 3 INT S IMPR PROB.
[10] Da Silva, W.T., 1992, International Journal of Approximate Reasoning, 7: 73.