Efficient training and improved performance of multilayer perceptron in pattern classification

Cited by: 138
Authors
Chaudhuri, BB [1 ]
Bhattacharya, U [1 ]
Affiliation
[1] Indian Stat Inst, Comp Vis & Pattern Recognit Unit, Calcutta 700035, W Bengal, India
Keywords
backpropagation algorithm; multilayer perceptron; faster training; pattern classification;
DOI
10.1016/S0925-2312(00)00305-2
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104; 0812; 0835; 1405
Abstract
In pattern recognition problems, the convergence of the backpropagation training algorithm for a multilayer perceptron is slow when the classes concerned have a complex decision boundary. To improve performance, we propose a technique that first selects training samples near the decision boundary without actually knowing where that boundary lies. To choose the training samples, a larger set of data with known class labels is considered. For each datum, its k neighbours are found. If the datum lies near the decision boundary, these k neighbours will not all come from the same class. A training set generated with this idea leads to quicker and better convergence of the training algorithm. To obtain more symmetric neighbours, the nearest centroid neighbourhood (Chaudhuri, Pattern Recognition Letters 17 (1996) 11-17) is used. The performance of the technique has been tested on synthetic data as well as speech vowel data in two Indian languages. (C) 2000 Elsevier Science B.V. All rights reserved.
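The boundary-sample selection described in the abstract can be illustrated with a minimal sketch. It uses a plain Euclidean k-nearest-neighbour rule rather than the nearest centroid neighbourhood of Chaudhuri (1996) that the paper actually employs, and the helper name select_boundary_samples is hypothetical: a labelled datum is kept for MLP training only if its k neighbours are not all of its own class.

import numpy as np

def select_boundary_samples(X, y, k=5):
    # Keep only samples whose k nearest neighbours are not all of the
    # sample's own class -- a proxy for lying near the decision boundary.
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)  # pairwise squared distances
    np.fill_diagonal(d2, np.inf)              # a point is not its own neighbour
    nn_idx = np.argsort(d2, axis=1)[:, :k]    # indices of the k nearest neighbours
    mask = (y[nn_idx] != y[:, None]).any(axis=1)
    return X[mask], y[mask]

# Example: two overlapping Gaussian classes; only points in the overlap
# region are retained as MLP training samples.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (200, 2)), rng.normal(1.5, 1.0, (200, 2))])
y = np.array([0] * 200 + [1] * 200)
Xb, yb = select_boundary_samples(X, y, k=5)
print(f"kept {len(yb)} of {len(y)} samples near the boundary")

With well-separated classes the selected subset is much smaller than the full training set, which is what yields the reported speed-up in backpropagation training.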
Pages: 11-27
Page count: 17
References
18 references in total
[1] Becker, S. Proceedings of the 1988 Connectionist Models Summer School, 1988, p. 29.
[2] Bhattacharya, U. Proceedings of the IEEE International Conference on Neural Networks, 1995, p. 2784.
[3] Chaudhuri, B.B. A new definition of neighborhood of a point in multi-dimensional space. Pattern Recognition Letters, 1996, 17(1): 11-17.
[4] Davis, D.T. International Joint Conference on Neural Networks, 1992.
[5] Drago, G.P.; Ridella, S. Statistically controlled activation weight initialization (SCAWI). IEEE Transactions on Neural Networks, 1992, 3(4): 627-631.
[6] Hart, P.E. Pattern Recognition and Scene Analysis, 1973.
[7] Jacobs, R.A. Increased rates of convergence through learning rate adaptation. Neural Networks, 1988, 1(4): 295-307.
[8] Johansson, E.M. International Journal of Neural Systems, 1991, 2: 291. DOI: 10.1142/S0129065791000261.
[9] Kruschke, J.K.; Movellan, J.R. Benefits of gain: speeded learning and minimal hidden layers in back-propagation networks. IEEE Transactions on Systems, Man and Cybernetics, 1991, 21(1): 273-280.
[10] Masters, T. Practical Neural Network Recipes in C++, 1993.