A tutorial on support vector regression

Cited by: 8503
Authors
Smola, AJ [1]
Schölkopf, B [2]
Affiliations
[1] Australian Natl Univ, RSISE, Canberra, ACT 0200, Australia
[2] Max Planck Inst Biol Cybernet, D-72076 Tubingen, Germany
Keywords
machine learning; support vector machines; regression estimation
DOI
10.1023/B:STCO.0000035301.49549.88
Chinese Library Classification
TP301 [Theory, Methods]
Discipline Classification Code
081202
Abstract
In this tutorial we give an overview of the basic ideas underlying Support Vector (SV) machines for function estimation. Furthermore, we include a summary of currently used algorithms for training SV machines, covering both the quadratic (or convex) programming part and advanced methods for dealing with large datasets. Finally, we mention some modifications and extensions that have been applied to the standard SV algorithm, and discuss the aspect of regularization from a SV perspective.
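The abstract describes support vector regression at a high level (function estimation with an epsilon-insensitive loss, trained by solving a convex quadratic program). As a hedged illustration only, not the authors' own code or data, the following minimal Python sketch fits an RBF-kernel SVR with scikit-learn on synthetic data; the parameters C and epsilon correspond to the trade-off and tube-width quantities discussed in the tutorial, and the specific values chosen here are arbitrary.

# Minimal sketch of epsilon-insensitive support vector regression.
# scikit-learn's SVR is used purely for illustration; it is not part
# of the original record.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Noisy samples of a smooth target function
X = np.sort(rng.uniform(0.0, 5.0, size=(80, 1)), axis=0)
y = np.sin(X).ravel() + 0.1 * rng.normal(size=80)

# RBF-kernel SVR: C trades off flatness against training error,
# epsilon sets the width of the insensitive tube around f(x).
model = SVR(kernel="rbf", C=10.0, epsilon=0.1, gamma=1.0)
model.fit(X, y)

y_pred = model.predict(X)
print("support vectors:", model.support_vectors_.shape[0])
print("training MSE:", float(np.mean((y - y_pred) ** 2)))

Only the training points lying on or outside the epsilon-tube become support vectors, which is why larger epsilon values typically yield sparser models.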
Pages: 199-222
Page count: 24