Bayesian trigonometric support vector classifier

Cited by: 17
Authors:
Chu, W [1]
Keerthi, SS [1]
Ong, CJ [1]
Affiliations:
[1] Natl Univ Singapore, Dept Mech Engn, Singapore 119260, Singapore
Keywords:
DOI: 10.1162/089976603322297368
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline classification codes: 081104; 0812; 0835; 1405
Abstract
This letter describes Bayesian techniques for support vector classification. In particular, we propose a novel differentiable loss function, called the trigonometric loss function, which has the desirable property of natural normalization in the likelihood function, and we then follow standard Gaussian process techniques to set up a Bayesian framework. In this framework, Bayesian inference is used to implement model adaptation while keeping the merits of the support vector classifier, such as sparseness and convex programming. This differs from standard Gaussian processes for classification. Moreover, we provide class probabilities when making predictions. Experimental results on benchmark data sets indicate the usefulness of this approach.
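The "natural normalization" property claimed in the abstract can be illustrated concretely. A minimal sketch, assuming the trigonometric loss takes the form l(z) = 2 ln sec(pi/4 * (1 - z)) for margin z = y*f in (-1, 1), with l(z) = 0 for z >= 1 and l(z) = +inf for z <= -1 (this formula comes from the paper's full text, not from the abstract above, so treat it as an assumption here): the induced likelihoods for the two class labels then sum to one with no extra normalizing constant.

```python
import numpy as np

def trig_loss(z):
    """Trigonometric loss evaluated elementwise on an array of margins z = y*f.

    Assumed form (not stated in the abstract):
        l(z) = +inf                          for z <= -1
        l(z) = 2*ln(sec(pi/4 * (1 - z)))     for -1 < z < 1
        l(z) = 0                             for z >= 1
    """
    z = np.atleast_1d(np.asarray(z, dtype=float))
    out = np.full_like(z, np.inf)   # z <= -1: infinite loss (label contradicted)
    out[z >= 1.0] = 0.0             # z >= 1: zero loss; this flat region yields sparseness
    mid = (z > -1.0) & (z < 1.0)
    out[mid] = 2.0 * np.log(1.0 / np.cos(np.pi / 4.0 * (1.0 - z[mid])))
    return out

def likelihood(y, f):
    """P(y | f) = exp(-l(y*f)) for labels y in {-1, +1}."""
    return np.exp(-trig_loss(y * f))

# Natural normalization: for |f| < 1,
#   P(+1|f) + P(-1|f) = cos^2(pi/4*(1-f)) + cos^2(pi/4*(1+f)) = 1,
# since the two cosine arguments sum to pi/2.
f = np.linspace(-0.9, 0.9, 7)
print(np.allclose(likelihood(1.0, f) + likelihood(-1.0, f), 1.0))  # True
```

Because exp(-l(y*f)) is already a valid probability mass over the two labels, the likelihood can be plugged directly into a Gaussian-process posterior without the per-point normalizing integrals that other smooth losses would require.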
Pages: 2227-2254
Page count: 28