Cost Complexity-Based Pruning of Ensemble Classifiers

Cited by: 46
Authors
Prodromidis, Andreas L. [1 ]
Stolfo, Salvatore J. [1 ]
Affiliations
[1] Department of Computer Science, Columbia University, New York, United States
Keywords
Classifier evaluation; Credit card fraud detection; Distributed data mining; Ensembles of classifiers; Meta-learning; Pruning
DOI
10.1007/PL00011678
Abstract
In this paper we study methods that combine multiple classification models learned over separate data sets. Numerous studies posit that such approaches provide the means to efficiently scale learning to large data sets, while also boosting the accuracy of individual classifiers. These gains, however, come at the expense of increased demand for run-time system resources: the final ensemble meta-classifier may consist of a large collection of base classifiers that require additional memory and slow down classification throughput. Here, we describe an algorithm for pruning the ensemble meta-classifier (i.e., discarding a subset of the available base classifiers) as a means of reducing its size while preserving its accuracy, and we present a technique for measuring the trade-off between predictive performance and available run-time system resources. The algorithm is independent of the method used initially to compute the meta-classifier. It is based on decision tree pruning methods and relies on mapping an arbitrary ensemble meta-classifier to a decision tree model. Through an extensive empirical study of meta-classifiers computed over two real data sets, we show that our pruning algorithm is a robust and competitive approach to discarding classification models without degrading the overall predictive performance of the smaller ensemble computed over the classifiers that remain after pruning.
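The abstract's core idea, discarding base classifiers while preserving ensemble accuracy, can be illustrated with a minimal sketch. Note this is NOT the paper's algorithm (which maps the meta-classifier to a decision tree and applies cost-complexity pruning); it is a simpler, hypothetical greedy backward-elimination variant using majority voting, included only to make the pruning trade-off concrete. All function names here are illustrative assumptions.

```python
# Hypothetical greedy ensemble-pruning sketch (not the paper's
# cost-complexity method): repeatedly drop any base classifier whose
# removal does not reduce validation accuracy below the full
# ensemble's baseline.
from collections import Counter


def majority_vote(classifiers, x):
    # Combine base-classifier predictions by unweighted majority vote;
    # Counter.most_common breaks ties by insertion (classifier) order.
    votes = Counter(clf(x) for clf in classifiers)
    return votes.most_common(1)[0][0]


def accuracy(classifiers, data):
    # Fraction of (x, y) validation pairs the voted ensemble gets right.
    return sum(majority_vote(classifiers, x) == y for x, y in data) / len(data)


def prune_ensemble(classifiers, validation):
    # Greedy backward elimination: keep removing classifiers as long as
    # the pruned ensemble still matches the original baseline accuracy.
    kept = list(classifiers)
    baseline = accuracy(kept, validation)
    changed = True
    while changed and len(kept) > 1:
        changed = False
        for clf in list(kept):
            trial = [c for c in kept if c is not clf]
            if accuracy(trial, validation) >= baseline:
                kept = trial
                changed = True
                break
    return kept
```

A pruned ensemble returned by this sketch is never less accurate on the validation set than the full ensemble, which mirrors the paper's goal of reducing size while preserving predictive performance; the paper's decision-tree mapping additionally makes the pruning independent of how the meta-classifier was originally combined.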
Pages: 449-469
Page count: 20