Theoretically optimal parameter choices for support vector regression machines with noisy input

Cited by: 16
Authors
Wang, ST [1]
Zhu, JG
Chung, FL
Lin, Q
Hu, DW
Affiliations
[1] So Yangtze Univ, Sch Informat Engn, Wuxi, Peoples R China
[2] Nanjing Univ Sci & Tech, Dept Comp Sci & Engn, Nanjing, Peoples R China
[3] Hong Kong Polytech Univ, Dept Comp, Hong Kong, Hong Kong, Peoples R China
[4] Natl Def Univ Sci & Tech, Sch Automat, Changsha, Peoples R China
[5] Chinese Acad Sci, Inst Software, Comp Sci Lab, Beijing, Peoples R China
Keywords
regularized linear regression; support vectors; Huber loss functions; norm-r loss functions;
DOI
10.1007/s00500-004-406-3
CLC classification
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Within the evidence framework, the regularized linear regression model is interpreted in this paper as the corresponding MAP (maximum a posteriori) estimation problem, and the general dependency relationships that the optimal parameters of this model should satisfy in the presence of noisy input are then derived. The support vector regression machines Huber-SVR and norm-r r-SVR are two typical instances of this model, and particular attention is paid to their optimal parameter choices. It turns out that, for typical Gaussian input noise, the parameter μ in Huber-SVR depends linearly on the input noise level, whereas the parameter r in r-SVR is inversely proportional to it. These theoretical results should help practitioners apply kernel-based regression techniques effectively.
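The sketch below illustrates one way the μ–σ relationship described above might be used in practice: it fits a regularized linear model under the Huber loss and sets the threshold μ proportional to a robust estimate of the Gaussian noise level. This is a minimal, hypothetical example rather than the authors' procedure; the proportionality constant c, the MAD-based noise estimate, and the ridge weight lam are assumptions made for illustration only.

```python
# Hypothetical sketch (not the authors' algorithm): choose the Huber threshold mu
# proportionally to an estimated Gaussian noise level, reflecting the linear
# mu ~ sigma dependency stated in the abstract. The constant `c`, the MAD-based
# noise estimate, and the ridge weight `lam` are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize


def huber(residuals, mu):
    """Huber loss: quadratic for |r| <= mu, linear beyond mu."""
    abs_r = np.abs(residuals)
    return np.where(abs_r <= mu,
                    0.5 * residuals ** 2,
                    mu * (abs_r - 0.5 * mu))


def fit_huber_linear(X, y, mu, lam=1e-2):
    """Minimize the sum of Huber losses plus an L2 regularizer on the weights."""
    def objective(w):
        r = y - X @ w
        return np.sum(huber(r, mu)) + lam * np.dot(w, w)

    res = minimize(objective, x0=np.zeros(X.shape[1]), method="L-BFGS-B")
    return res.x


rng = np.random.default_rng(0)
sigma = 0.3                                   # true Gaussian noise level
X = np.column_stack([np.ones(200), rng.uniform(-1.0, 1.0, 200)])
y = 2.0 + 1.5 * X[:, 1] + rng.normal(0.0, sigma, 200)

# Robust noise estimate: median absolute deviation of ordinary least-squares residuals.
w_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
sigma_hat = 1.4826 * np.median(np.abs(y - X @ w_ols))

c = 1.0                                       # hypothetical proportionality constant
mu = c * sigma_hat                            # mu chosen linearly in the noise level
w = fit_huber_linear(X, y, mu)
print(f"sigma_hat = {sigma_hat:.3f}, mu = {mu:.3f}, weights = {w}")
```

In the same spirit, for a norm-r loss one would decrease r as the estimated noise level grows, following the inverse dependency stated in the abstract.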
Pages: 732-741
Number of pages: 10
References
15 items in total
  • [1] [Anonymous], 2001, Proceedings of the Eighth International Workshop on Artificial Intelligence and Statistics
  • [2] Cherkassky V., 2003, in press, Neural Networks
  • [3] Cristianini N., 2000, Intelligent Data Analysis: An Introduction, DOI 10.1017/CBO9780511801389
  • [4] A probabilistic framework for SVM regression and error bar estimation
    Gao, JB
    Gunn, SR
    Harris, CJ
    Brown, M
    [J]. MACHINE LEARNING, 2002, 46 (1-3) : 71 - 89
  • [5] Linear dependency between ε and the input noise in ε-support vector regression
    Kwok, JT
    Tsang, IW
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS, 2003, 14 (03): : 544 - 553
  • [6] Smola A.J., 1998, NeuroCOLT2 Technical Report NC2-TR-1998-030, Royal Holloway
  • [7] Smola A.J., 1998, Proceedings of the International Conference on Artificial Neural Networks
  • [8] Vapnik V., 1998, Statistical Learning Theory, Wiley
  • [9] Wang S., unpublished, Eng Sci China
  • [10] Wang S., in press, Int J Soft