Boosting methods for regression

Cited by: 123
Authors
Duffy, N [1 ]
Helmbold, D [1 ]
Affiliations
[1] Univ Calif Santa Cruz, Dept Comp Sci, Santa Cruz, CA 95064 USA
Funding
National Science Foundation (USA)
Keywords
learning; boosting; arcing; ensemble methods; regression; gradient descent;
DOI
10.1023/A:1013685603443
Chinese Library Classification
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
In this paper we examine ensemble methods for regression that leverage or "boost" base regressors by iteratively calling them on modified samples. The most successful leveraging algorithm for classification is AdaBoost, an algorithm that requires only modest assumptions on the base learning method for its strong theoretical guarantees. We present several gradient descent leveraging algorithms for regression and prove AdaBoost-style bounds on their sample errors using intuitive assumptions on the base learners. We bound the complexity of the regression functions produced in order to derive PAC-style bounds on their generalization errors. Experiments validate our theoretical results.
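The "gradient descent leveraging" view summarized in the abstract lends itself to a short illustration: each round fits a base regressor to the current residuals (the negative gradient of the squared loss) and adds it to the ensemble with a small step size. The sketch below is a minimal example in that spirit, not the paper's algorithms (which handle other losses and carry the formal sample-error and generalization bounds); the function names, the fixed step size, and the use of shallow scikit-learn trees as base learners are all assumptions made for the example.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def leverage(X, y, rounds=50, step=0.1, max_depth=2):
    """Additive ensemble F(x) = step * sum_t h_t(x), built by fitting
    each base regressor h_t to the residuals of the current ensemble.
    Illustrative sketch only; names and parameters are assumptions."""
    ensemble = []
    residual = y.astype(float).copy()  # with F = 0, residual = y - F = y
    for _ in range(rounds):
        h = DecisionTreeRegressor(max_depth=max_depth).fit(X, residual)
        ensemble.append(h)
        residual -= step * h.predict(X)  # descend along the fitted direction
    return ensemble

def predict(ensemble, X, step=0.1):
    return step * sum(h.predict(X) for h in ensemble)

# Toy usage: recover a noisy sine curve.
rng = np.random.default_rng(0)
X = rng.uniform(0, 6, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)
model = leverage(X, y)
print("train MSE:", np.mean((predict(model, X) - y) ** 2))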
Pages: 153-200
Page count: 48