Relationship between support vector set and kernel functions in SVM

Cited by: 50
Authors
Zhang, L [1 ]
Zhang, B
Affiliations
[1] Anhui Univ, Artificial Intelligence Inst, Hefei 230039, Peoples R China
[2] Tsinghua Univ, Dept Comp Sci, Beijing 100084, Peoples R China
[3] Tsinghua Univ, State Key Lab Intelligent Technol & Syst, Beijing 100084, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
support vector machine (SVM); support vector; kernel function; constructive learning theory; cover; boundary;
DOI
10.1007/BF02948823
Chinese Library Classification
TP3 [Computing Technology, Computer Technology];
Discipline Code
0812;
Abstract
Based on a constructive learning approach, covering algorithms, we investigate the relationship between support vector sets and kernel functions in support vector machines (SVM). An interesting result is obtained: in the linearly non-separable case, any sample of a given sample set K can become a support vector under a certain kernel function. The result shows that when the sample set K is linearly non-separable, even though the chosen kernel function satisfies Mercer's condition, its corresponding support vector set is not necessarily the subset of K that plays a crucial role in classifying K. For a given sample set, what subset plays the crucial role in classification? To explore this problem, a new concept, the boundary (or boundary points), is defined and its properties are discussed. Given a sample set K, we show that the decision functions for classifying the boundary points of K are the same as those for classifying K itself, and that the boundary points of K depend only on K and the structure of the space in which K is located, independent of the approach chosen for finding the boundary. Therefore, the boundary point set may serve as the subset of K that plays a crucial role in classification. These results are important for understanding the principle of the support vector machine (SVM) and for developing new learning algorithms.
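The abstract's first result — that under a suitable kernel every sample of a linearly non-separable set can become a support vector — can be illustrated numerically. The sketch below (an assumption for illustration, not the paper's algorithm) solves a bias-free SVM dual by projected-gradient ascent on an XOR-style non-separable set, using numpy only: as the RBF kernel width shrinks (gamma grows), the Gram matrix approaches the identity and every sample acquires a positive dual coefficient, i.e., every sample is a support vector.

```python
import numpy as np

def rbf_gram(X, gamma):
    # RBF kernel matrix: K_ij = exp(-gamma * ||x_i - x_j||^2)
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def svm_dual(K, y, C=10.0, lr=0.05, steps=20000):
    # Projected-gradient ascent on the bias-free SVM dual:
    #   max  sum(a) - 0.5 * a^T Q a,   0 <= a_i <= C,   Q = (y y^T) * K
    # (the bias term and its equality constraint are dropped for simplicity)
    Q = np.outer(y, y) * K
    a = np.zeros(len(y))
    for _ in range(steps):
        a = np.clip(a + lr * (1.0 - Q @ a), 0.0, C)
    return a

# XOR-style sample set: linearly non-separable in the input space
X = np.array([[0.0, 0.0], [1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])

for gamma in (0.5, 1e3):
    a = svm_dual(rbf_gram(X, gamma), y)
    n_sv = int((a > 1e-6).sum())
    print(f"gamma={gamma}: support vectors = {n_sv} of {len(X)}")
```

With a very large gamma the off-diagonal kernel values vanish, so the dual optimum gives every sample a nonzero coefficient, matching the abstract's claim that kernel choice alone can make any sample a support vector.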
Pages: 549-555
Page count: 7