Feature extraction approaches based on matrix pattern: MatPCA and MatFLDA

Cited: 70
Authors
Chen, SC [1 ]
Zhu, YL
Zhang, DQ
Yang, JY
Affiliations
[1] Nanjing Univ Aeronaut & Astronaut, Dept Comp Sci & Engn, Nanjing 210016, Jiangsu, Peoples R China
[2] Nanjing Univ Sci & Technol, Dept Comp Sci, Nanjing 210094, Peoples R China
Keywords
pattern representation; principal component analysis (PCA); Fisher linear discriminant analysis (FLDA); vector representation; matrix representation; feature extraction; pattern recognition
DOI
10.1016/j.patrec.2004.10.009
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Principal component analysis (PCA) and Fisher linear discriminant analysis (FLDA), two popular feature extraction approaches in pattern recognition and data analysis, extract the needed features directly from vector patterns, i.e., before applying them, any non-vector pattern such as an image is first vectorized into a vector pattern by some technique such as concatenation. However, such vectorization has been shown not to be beneficial for image recognition, as demonstrated by both the algebraic feature extraction approach and 2DPCA. In this paper, inspired by these two approaches, we go in the opposite direction: we extract features for any vector pattern by first matrixizing it into a matrix pattern and then applying the matrixized versions of PCA and FLDA, called MatPCA and MatFLDA, to that pattern. MatFLDA uses, in essence, the same principle as the algebraic feature extraction approach and is constructed with an objective function similar to that of FLDA, while MatPCA, like PCA, minimizes the reconstruction error over the training samples to obtain a set of projection vectors, a derivation somewhat different from that of 2DPCA despite their equivalence. Finally, experiments on 10 publicly available datasets show that MatPCA and MatFLDA achieve performance improvements to different degrees on 7 and 5 of the datasets, respectively, while the computational burden of extracting features is greatly reduced. In addition, it is noteworthy that the proposed approaches are still linear: the improvement in classification accuracy results not from the commonly used non-linearization of the original linear approaches but from the simple matrixization. Furthermore, another prominent merit of matrixizing FLDA is that it naturally breaks the notorious rank limitation, namely that the number of discriminating vectors that can be found is bounded by C - 1 for a C-class problem, and it does so without introducing additional computational cost.
(c) 2004 Elsevier B.V. All rights reserved.
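To make the abstract's core idea concrete, below is a minimal NumPy sketch of the two steps it describes: matrixizing a vector pattern into a matrix pattern, then extracting features with a MatPCA-style projection obtained by eigen-decomposing the matrix-pattern covariance (the derivation the abstract notes is equivalent to 2DPCA). Function names, the chosen matrix shape, and the toy data are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def matrixize(x, rows, cols):
    # Reshape a 1-D vector pattern into a rows x cols matrix pattern.
    # The paper's point: this works for ANY vector pattern, not just images.
    return np.asarray(x, dtype=float).reshape(rows, cols)

def matpca(samples, d):
    # samples: sequence of matrix patterns, all of shape (rows, cols).
    # Returns a (cols, d) projection matrix whose columns are the top-d
    # eigenvectors of the matrix-pattern covariance, which minimizes the
    # reconstruction error over the training samples (2DPCA-equivalent).
    A = np.stack(samples)                       # (n, rows, cols)
    centered = A - A.mean(axis=0)
    # Covariance of matrix patterns: mean of (A_i - mean)^T (A_i - mean).
    G = np.einsum('nij,nik->jk', centered, centered) / len(samples)
    _, vecs = np.linalg.eigh(G)                 # eigenvalues ascending
    return vecs[:, ::-1][:, :d]                 # keep top-d eigenvectors

# Toy usage: five 12-dimensional vector patterns matrixized to 3 x 4.
X = [matrixize(np.arange(12) * (i + 1), 3, 4) for i in range(5)]
W = matpca(X, 2)
features = [M @ W for M in X]                   # each feature is 3 x 2
```

Note how the projection matrix is only cols x cols (here 4 x 4) rather than 12 x 12 as in vector PCA, which is the source of the reduced computational burden the abstract reports.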
Pages: 1157-1167 (11 pages)