Additive logistic regression: A statistical view of boosting - Rejoinder

Cited by: 4183
Authors
Friedman, J [1 ]
Hastie, T [1 ]
Tibshirani, R [1 ]
Affiliation
[1] Stanford Univ, Dept Stat, Stanford, CA 94305 USA
Keywords
Classification; Machine learning; Nonparametric estimation; Stagewise fitting; Tree
DOI
10.1214/aos/1016218223
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline Classification Codes
020208; 070103; 0714
Abstract
Boosting is one of the most important recent developments in classification methodology. Boosting works by sequentially applying a classification algorithm to reweighted versions of the training data and then taking a weighted majority vote of the sequence of classifiers thus produced. For many classification algorithms, this simple strategy results in dramatic improvements in performance. We show that this seemingly mysterious phenomenon can be understood in terms of well-known statistical principles, namely additive modeling and maximum likelihood. For the two-class problem, boosting can be viewed as an approximation to additive modeling on the logistic scale using maximum Bernoulli likelihood as a criterion. We develop more direct approximations and show that they produce nearly identical results to boosting. Direct multiclass generalizations based on multinomial likelihood are derived that exhibit performance comparable to other recently proposed multiclass generalizations of boosting in most situations, and far superior in some. We suggest a minor modification to boosting that can reduce computation, often by factors of 10 to 50. Finally, we apply these insights to produce an alternative formulation of boosting decision trees. This approach, based on best-first truncated tree induction, often leads to better performance, and can provide interpretable descriptions of the aggregate decision rule. It is also much faster computationally, making it more suitable for large-scale data mining applications.
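The reweight-then-vote procedure the abstract describes can be sketched as a minimal discrete AdaBoost with decision-stump weak learners. This is an illustrative reconstruction, not code from the paper: the stump search, the toy dataset in the usage below, and the round count `M` are all assumptions chosen for clarity.

```python
import numpy as np

def fit_stump(X, y, w):
    """Weak learner: exhaustive search over axis-aligned decision stumps,
    minimizing the weighted 0/1 error under the current weights w."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):  # polarity: which side of thr predicts +1
                pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
                err = w[pred != y].sum()
                if err < best_err:
                    best_err, best = err, (j, thr, pol)
    return best, best_err

def stump_predict(stump, X):
    j, thr, pol = stump
    return np.where(pol * (X[:, j] - thr) >= 0, 1, -1)

def adaboost(X, y, M=10):
    """Discrete AdaBoost: refit a weak classifier to reweighted data each
    round, then combine the rounds by a weighted majority vote."""
    n = len(y)
    w = np.full(n, 1.0 / n)                   # start with uniform weights
    ensemble = []
    for _ in range(M):
        stump, err = fit_stump(X, y, w)
        err = max(err, 1e-10)                 # guard against a perfect weak learner
        alpha = 0.5 * np.log((1 - err) / err) # vote weight of this round
        pred = stump_predict(stump, X)
        w = w * np.exp(-alpha * y * pred)     # upweight the misclassified points
        w /= w.sum()
        ensemble.append((alpha, stump))
    return ensemble

def predict(ensemble, X):
    # weighted majority vote: sign of the additive score F(x)
    F = sum(alpha * stump_predict(stump, X) for alpha, stump in ensemble)
    return np.where(F >= 0, 1, -1)
```

On a toy one-dimensional "interval" target (y = +1 only in the middle of the range), no single stump is consistent with the labels, but the boosted combination of stumps can fit it; this mirrors the abstract's point that the aggregate vote improves on the base classifier.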
Pages: 400-407
Page count: 8