A feature selection Newton method for support vector machine classification

Cited by: 174
Authors
Fung, GM [1]
Mangasarian, OL [1]
Affiliations
[1] Univ Wisconsin, Dept Comp Sci, Madison, WI 53706 USA
Funding
U.S. National Science Foundation;
Keywords
classification; feature selection; linear programming; Newton method;
DOI
10.1023/B:COAP.0000026884.66338.df
Chinese Library Classification (CLC)
C93 [Management]; O22 [Operations Research];
Discipline codes
070105 ; 12 ; 1201 ; 1202 ; 120202 ;
Abstract
A fast Newton method that suppresses input space features is proposed for a linear programming formulation of support vector machine classifiers. The proposed stand-alone method can handle classification problems in very high dimensional spaces, such as 28,032 dimensions, and generates a classifier that depends on very few input features, such as 7 out of the original 28,032. The method can also handle problems with a large number of data points, and it requires no specialized linear programming packages but merely a linear equation solver. For nonlinear kernel classifiers, the method utilizes a minimal number of kernel functions in the classifier that it generates.
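The abstract's central idea, that a 1-norm (linear programming) SVM formulation drives most input-feature weights exactly to zero, can be illustrated with a minimal sketch. This is NOT the paper's Newton method: it is a generic proximal-gradient solver for a squared-hinge loss with an L1 penalty, and all names, data, and parameter values below are illustrative assumptions.

```python
import numpy as np

def l1_svm(X, y, lam=0.1, lr=0.01, iters=2000):
    """Squared-hinge loss + L1 penalty, solved by proximal gradient (ISTA).

    The soft-thresholding step zeroes out weights of uninformative
    features, mimicking the feature-suppressing behavior described in
    the abstract (illustrative only; not the paper's Newton algorithm).
    """
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(iters):
        margins = y * (X @ w + b)
        active = margins < 1            # points violating the margin
        resid = y[active] * (1 - margins[active])
        # gradient of the mean squared-hinge loss
        gw = -2.0 * (X[active].T @ resid) / n
        gb = -2.0 * np.sum(resid) / n
        w -= lr * gw
        b -= lr * gb
        # proximal step: soft-threshold toward zero (L1 penalty)
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)
    return w, b

# Toy problem: only the first 2 of 50 features carry any signal.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
y = np.sign(X[:, 0] + X[:, 1])
w, b = l1_svm(X, y)
# Expect far fewer than 50 nonzero weights, concentrated on features 0 and 1.
print(np.count_nonzero(w))
```

The same feature-suppression effect is what lets the paper's classifier in 28,032 dimensions depend on only 7 input features; the L1 penalty here plays the role of the 1-norm term in the linear programming formulation.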
Pages: 185-202
Number of pages: 18
References
27 records
[1] [Anonymous], 1453 U WISC COMP SCI
[2] [Anonymous], MATLAB User's Guide
[3] [Anonymous], 2001, Proc. SIAM Int'l Conf. Data Mining
[4] Bertsekas D.P., 1999, Nonlinear Programming
[5] Bi J., 2003, Journal of Machine Learning Research, V3, P1229, DOI 10.1162/153244303322753643
[6] Bradley P.S., 1998, Machine Learning: Proceedings of the Fifteenth International Conference (ICML'98), P82
[7] Cherkassky V.S., 1998, Learning from Data: Concepts, Theory, and Methods, 1st ed.
[8] Facchinei F., Minimization of SC1 functions and the Maratos effect [J], Operations Research Letters, 1995, 17(3):131-137
[9] Fiacco A.V., 1990, Nonlinear Programming: Sequential Unconstrained Minimization Techniques
[10] Fung G., Mangasarian O.L., Finite Newton method for Lagrangian support vector machine classification [J], Neurocomputing, 2003, 55(1-2):39-55