A FURTHER COMPARISON OF SPLITTING RULES FOR DECISION-TREE INDUCTION

Cited by: 93
Authors:
BUNTINE, W [1]
NIBLETT, T [1]
Affiliation:
[1] TURING INST, GLASGOW G1 2AD, SCOTLAND
Keywords:
DECISION TREES; INDUCTION; NOISY DATA; COMPARATIVE STUDIES
DOI:
10.1023/A:1022686419106
Chinese Library Classification (CLC):
TP18 [Artificial Intelligence Theory]
Subject classification codes:
081104; 0812; 0835; 1405
Abstract
One approach to learning classification rules from examples is to build decision trees. A review and comparison paper by Mingers (1989) looked at the first stage of tree building, which uses a "splitting rule" to grow trees with a greedy recursive partitioning algorithm. That paper considered a number of different measures and experimentally examined their behavior on four domains. Its main conclusion was that a random splitting rule does not significantly decrease classification accuracy. This note suggests an alternative experimental method and presents additional results on further domains. Our results indicate that random splitting leads to increased error and are therefore at variance with those presented by Mingers.
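The comparison described above, between a greedy splitting rule and a random one, can be illustrated with a minimal sketch. This is not code from the paper: the toy attributes (`outlook`, `id_parity`), the use of information gain as the greedy measure, and all function names are illustrative assumptions; information gain is one of the measures considered in this literature, while a random rule simply ignores the data.

```python
# Illustrative sketch (not the paper's code): a greedy splitting rule
# choosing by information gain, versus a random splitting rule.
import math
import random

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def information_gain(examples, attr):
    """Entropy reduction from partitioning examples on attribute attr."""
    labels = [y for _, y in examples]
    n = len(examples)
    remainder = 0.0
    for v in {x[attr] for x, _ in examples}:
        subset = [y for x, y in examples if x[attr] == v]
        remainder += (len(subset) / n) * entropy(subset)
    return entropy(labels) - remainder

def greedy_split(examples, attrs):
    """Greedy rule: pick the attribute with maximal information gain."""
    return max(attrs, key=lambda a: information_gain(examples, a))

def random_split(attrs, rng):
    """Random rule: pick any attribute, ignoring the data entirely."""
    return rng.choice(attrs)

# Toy data: 'outlook' predicts the class perfectly; 'id_parity' is noise.
examples = [
    ({"outlook": "sunny", "id_parity": 0}, "yes"),
    ({"outlook": "sunny", "id_parity": 1}, "yes"),
    ({"outlook": "rain",  "id_parity": 0}, "no"),
    ({"outlook": "rain",  "id_parity": 1}, "no"),
]
attrs = ["outlook", "id_parity"]
print(greedy_split(examples, attrs))            # -> outlook
print(random_split(attrs, random.Random(0)))    # either attribute
```

On this toy data the greedy rule always selects the informative attribute (gain 1.0 bit vs 0.0), whereas the random rule picks either attribute with equal probability; the experimental question in the paper is how much such blind choices cost in accuracy on real domains.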
Pages: 75-85
Page count: 11
Related papers (15 in total)
  • [1] Breiman L., 1984, CLASSIFICATION REGRE
  • [2] BUNTINE W, 1989, 6TH P INT MACH LEARN
  • [3] Cestnik B., 1987, PROGR MACHINE LEARNI, P31
  • [4] Clark P., 1989, Machine Learning, V3, P261, DOI 10.1023/A:1022641700528
  • [5] FISHER DH, 1989, 11TH P INT JOINT C A, P788
  • [6] Fisher RA, 1936, The use of multiple measurements in taxonomic problems, ANNALS OF EUGENICS, V7, P179-188
  • [7] Mingers J., 1989, Machine Learning, V3, P319, DOI 10.1007/BF00116837
  • [8] MOONEY R, 1989, 11TH IJCAI 89 INT JO, P775
  • [9] Quinlan J. R., 1986, Machine Learning, V1, P81, DOI 10.1007/BF00116251
  • [10] QUINLAN JR, 1988, KNOWL ACQUIS, P239