The theoretical analysis of FDA and applications

Times Cited: 8
Authors
Tao, Q [1 ]
Wu, GW
Wang, J
Affiliations
[1] Chinese Acad Sci, Inst Automat, Key Lab Complex Syst & Intelligence Sci, Beijing 100080, Peoples R China
[2] Chinese Acad Sci, Inst Comp Technol, Bioinformat Res Grp, Beijing 100080, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
classification; Fisher discriminant analysis; support vector machines; LS-SVM; regression;
D O I
10.1016/j.patcog.2005.09.018
Chinese Library Classification
TP18 [Artificial intelligence theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Representation and embedding are usually the two necessary phases in designing a classifier. Fisher discriminant analysis (FDA) can be regarded as seeking a direction along which the projected samples are well separated. In this paper, we analyze FDA in terms of representation and embedding. The main contribution is a proof that the general framework of FDA is based on the simplest and most intuitive FDA with zero within-class variance, which clearly illustrates the mechanism of FDA. Based on our analysis, ε-insensitive SVM regression can be viewed as a soft FDA with ε-insensitive within-class variance and an L1-norm penalty. To verify this viewpoint, several real classification experiments are conducted, demonstrating that the performance of the regression-based classification technique is comparable to that of regular FDA and SVM. (c) 2005 Pattern Recognition Society. Published by Elsevier Ltd. All rights reserved.
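The regression-based classification idea summarized in the abstract can be illustrated with a minimal sketch (not the authors' exact experimental setup): encode the two class labels as ±1 regression targets, fit an ε-insensitive support vector regressor, and classify test points by the sign of the regression output, comparing against a standard SVM and Fisher discriminant analysis (LDA). The dataset and parameter choices here are arbitrary illustrations using scikit-learn.

```python
# Sketch of classification via ε-insensitive SVM regression:
# regress on ±1 targets, then predict by the sign of the output.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC, SVR

# Synthetic two-class problem (illustrative only).
X, y = make_classification(n_samples=400, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Encode class labels {0, 1} as regression targets {-1, +1}.
t_tr = 2 * y_tr - 1

# ε-insensitive SVM regression, then classify by sign of the output.
reg = SVR(kernel="linear", epsilon=0.1).fit(X_tr, t_tr)
y_reg = (reg.predict(X_te) >= 0).astype(int)
acc_reg = np.mean(y_reg == y_te)

# Baselines: a standard linear SVM and Fisher discriminant analysis.
acc_svm = SVC(kernel="linear").fit(X_tr, y_tr).score(X_te, y_te)
acc_fda = LinearDiscriminantAnalysis().fit(X_tr, y_tr).score(X_te, y_te)

print(f"SVR-as-classifier: {acc_reg:.3f}  SVM: {acc_svm:.3f}  FDA: {acc_fda:.3f}")
```

On reasonably separable data all three accuracies come out close, which is the kind of comparability the abstract reports for the regression-based technique.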
Pages
1199-1204 (6 pages)