General and efficient multisplitting of numerical attributes

Cited by: 114
Authors
Elomaa, T
Rousu, J
Affiliations
[1] Univ Helsinki, Dept Comp Sci, FIN-00014 Helsinki, Finland
[2] VTT Biotechnol & Food Res, FIN-02044 VTT, Finland
Funding
Academy of Finland
Keywords
supervised learning; numerical attributes; optimal partitions; evaluation functions
DOI
10.1023/A:1007674919412
CLC number
TP18 [Theory of Artificial Intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
In supervised learning, numerical attributes often require special treatment and do not fit the learning scheme as well as one could hope. Nevertheless, they are common in practical tasks and therefore need to be taken into account. We characterize the well-behavedness of an evaluation function, a property that guarantees that the optimal multi-partition of an arbitrary numerical domain is defined on boundary points. Well-behavedness reduces the number of candidate cut points that must be examined when multisplitting numerical attributes. Many commonly used attribute evaluation functions possess this property; we demonstrate that the cumulative functions Information Gain and Training Set Error, as well as the non-cumulative functions Gain Ratio and Normalized Distance Measure, are all well-behaved. We also devise a method for finding optimal multisplits efficiently by examining the minimum number of boundary-point combinations required to produce partitions that are optimal with respect to a cumulative and well-behaved evaluation function. Our empirical experiments validate the utility of optimal multisplitting: it produces consistently better partitions than alternative approaches and requires only comparable time. In top-down induction of decision trees, the choice of evaluation function has a more decisive effect on the result than the choice of partitioning strategy; optimizing the value of the most common attribute evaluation functions does not raise the accuracy of the produced decision trees. In our tests, the construction time using optimal multisplitting was, on average, twice that required by greedy multisplitting, which in turn required on average twice the time of binary splitting.
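To make the abstract's key ideas concrete, the sketch below (ours, not the authors' implementation) computes boundary points in the sense of Fayyad and Irani and then finds an error-minimizing k-way split by exhaustively scoring combinations of boundary points under Training Set Error, one of the cumulative, well-behaved functions named above. The paper's actual algorithm avoids this exhaustive search via dynamic programming; the function names (`boundary_points`, `optimal_multisplit`) and the brute-force search are our assumptions for illustration only.

```python
from itertools import combinations
from collections import Counter

def boundary_points(values, labels):
    """Candidate cut points: the midpoint between two adjacent distinct
    attribute values is a boundary point unless all examples taking
    those two values belong to one and the same class."""
    classes_at = {}
    for v, y in zip(values, labels):
        classes_at.setdefault(v, set()).add(y)
    vals = sorted(classes_at)
    return [(v1 + v2) / 2
            for v1, v2 in zip(vals, vals[1:])
            if len(classes_at[v1] | classes_at[v2]) > 1]

def training_set_error(segments):
    """Cumulative evaluation: examples outside the majority class,
    summed over the intervals of the partition."""
    return sum(len(seg) - max(Counter(seg).values()) for seg in segments)

def optimal_multisplit(values, labels, k):
    """Optimal k-way split by brute force over (k-1)-subsets of the
    boundary points; well-behavedness guarantees that no other cut
    points need to be considered. The paper replaces this exhaustive
    search with an efficient dynamic-programming scheme."""
    pairs = sorted(zip(values, labels))

    def segments(cuts):
        # Slice the sorted examples at the given cut points and keep
        # only the class labels of each interval.
        segs, i = [], 0
        for c in cuts:
            j = i
            while j < len(pairs) and pairs[j][0] < c:
                j += 1
            segs.append([y for _, y in pairs[i:j]])
            i = j
        segs.append([y for _, y in pairs[i:]])
        return segs

    best = min(combinations(boundary_points(values, labels), k - 1),
               key=lambda cuts: training_set_error(segments(cuts)))
    return best, training_set_error(segments(best))

# Example: 8 examples, 7 possible cut points, only 2 boundary points.
vals = [1, 2, 3, 4, 5, 6, 7, 8]
labs = ["a", "a", "b", "b", "b", "a", "a", "a"]
print(boundary_points(vals, labs))        # [2.5, 5.5]
print(optimal_multisplit(vals, labs, 3))  # ((2.5, 5.5), 0)
```

On this toy data, well-behavedness prunes the seven possible cut points down to two boundary points, so the optimal three-way partition is found after scoring a single combination.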
Pages: 201-244
Number of pages: 44