The eigenspace separation transform for neural-network classifiers

Cited by: 19
Author
Torrieri, D [1 ]
Affiliation
[1] US Army Research Laboratory, AMSRL-IS-TA, Adelphi, MD 20783, USA
Keywords
linear transform; classifier; preprocessor; data compression; dimensionality reduction; feature extractor; multilayer perceptron; radial basis function
DOI
10.1016/S0893-6080(98)00138-5
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
This paper presents a linear transform that compresses data in a manner designed to improve the performance of a neural network used as a binary classifier. The classifier is intended to accommodate data distributions that may be non-normal, may have equal class means, may be multimodal, and may have unknown a priori probabilities for the two classes. The transform, called the eigenspace separation transform, allows the size of a neural network to be reduced while enhancing its generalization accuracy as a binary classifier. Published by Elsevier Science Ltd.
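The abstract summarizes the transform but does not spell out its construction. As an illustration only, the Python/NumPy sketch below shows one plausible reading of an eigenspace-separation-style preprocessor: form the second-moment (correlation) matrices of the two classes, eigendecompose their difference, and project onto the eigenvectors belonging to the dominant-sign eigenvalues. The function name est_projection and every implementation detail here are assumptions, not the paper's definitive algorithm.

import numpy as np

def est_projection(X0, X1):
    # Hypothetical sketch of an eigenspace-separation-style projection;
    # the construction below is an assumption, not the paper's exact method.
    # X0, X1: (n_samples, n_features) training matrices for the two classes.
    # Returns W of shape (n_features, k); a feature vector x is compressed
    # as W.T @ x before being fed to the neural-network classifier.

    # Class second-moment (correlation) matrices.
    R0 = X0.T @ X0 / X0.shape[0]
    R1 = X1.T @ X1 / X1.shape[0]

    # Eigendecomposition of the symmetric difference matrix.
    eigvals, eigvecs = np.linalg.eigh(R0 - R1)

    # Keep the eigenvectors whose eigenvalue sign class has the larger
    # total magnitude; they span the reduced feature space.
    pos, neg = eigvals > 0, eigvals < 0
    keep = pos if eigvals[pos].sum() >= -eigvals[neg].sum() else neg
    return eigvecs[:, keep]

Under these assumptions, the compressed features X0 @ W and X1 @ W would then train a smaller classifier such as the multilayer perceptron or radial basis function network named in the keywords.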
Pages: 419-427
Number of pages: 9