Adaptive competitive learning neural networks

Cited by: 4
Authors
Abas, Ahmed R. [1 ,2 ,3 ]
Affiliations
[1] Umm Al Qura Univ, Dept Comp Sci, Makka Al Mukarrama, Lith, Saudi Arabia
[2] Zagazig Univ, Fac Computers & Informat, Dept Comp Sci, Zagazig, Egypt
[3] Cairo Univ, Fac Computers & Informat, Cairo, Egypt
Keywords
Adaptive competitive learning neural networks; Conscience learning; Dead neurons; Number of clusters; Robust clustering;
DOI
10.1016/j.eij.2013.08.001
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline classification codes
081104; 0812; 0835; 1405;
Abstract
In this paper, the adaptive competitive learning (ACL) neural network algorithm is proposed. This neural network not only groups similar input feature vectors together but also determines the appropriate number of groups of these vectors. The algorithm uses a newly proposed criterion, referred to as the ACL criterion, which evaluates the different clustering structures produced by the ACL neural network for an input data set and then selects the best clustering structure and the corresponding network architecture for that data set. The selected structure is composed of the minimum number of clusters that are compact and balanced in their sizes. The selected network architecture is efficient in terms of its complexity, as it contains the minimum number of neurons. The synaptic weight vectors of these neurons represent well-separated, compact and balanced clusters in the input data set. The performance of the ACL algorithm is evaluated and compared with that of a recently proposed algorithm in the literature in clustering an input data set and determining its number of clusters. Results show that the ACL algorithm is more accurate and robust than the other algorithm in both determining the number of clusters and allocating input feature vectors into these clusters, especially with data sets that are sparsely distributed. (C) 2013 Production and hosting by Elsevier B.V. on behalf of Faculty of Computers and Information, Cairo University.
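The abstract does not give the ACL criterion itself, but the underlying mechanism it builds on, competitive learning, can be illustrated with a minimal winner-take-all sketch: each input vector updates only the synaptic weight vector closest to it, so weights drift toward cluster centres. This is a generic illustration under assumed parameters (learning rate, epoch count), not the paper's ACL algorithm, which additionally selects the number of clusters.

```python
import numpy as np

def competitive_learning(X, k, lr=0.5, epochs=50, seed=0):
    """Plain winner-take-all competitive learning (illustrative only,
    not the paper's ACL algorithm): for each input, only the nearest
    weight vector is moved toward that input."""
    rng = np.random.default_rng(seed)
    # initialize the k weight vectors from distinct random input samples
    W = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    for epoch in range(epochs):
        eta = lr * (1 - epoch / epochs)  # decaying learning rate
        for x in X[rng.permutation(len(X))]:
            winner = np.argmin(np.linalg.norm(W - x, axis=1))
            W[winner] += eta * (x - W[winner])  # move winner toward input
    return W

# two well-separated Gaussian blobs; the weights should settle near the centres
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.1, (50, 2)),
               rng.normal(5, 0.1, (50, 2))])
W = competitive_learning(X, k=2)
```

The dead-neuron problem mentioned in the keywords arises in exactly this scheme: a weight vector that never wins is never updated. Conscience learning (DeSieno, reference [10]) penalizes frequent winners to keep all neurons participating.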
Pages: 183-194
Page count: 12
References
24 in total
[1] Abas, Ahmed R. An algorithm for unsupervised learning and optimization of finite mixture models [J]. EGYPTIAN INFORMATICS JOURNAL, 2011, 12(01):19-27.
[2] Abas, Ahmed R. On determining efficient finite mixture models with compact and essential components for clustering data [J]. EGYPTIAN INFORMATICS JOURNAL, 2013, 14(01):79-88.
[3] [Anonymous], 2003, STAT PATTERN RECOGNI.
[4] Behnke, S; Karayiannis, NB. Competitive neural trees for pattern classification [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS, 1998, 9(06):1352-1369.
[5] Biernacki C, 1997, COMPUTING SCI STAT, V29, P451.
[6] Budura G, 2006, ELECT ENERGETICS J, V19, P261.
[7] Chinrungrueng, C; Sequin, CH. Optimal adaptive k-means algorithm with dynamic adjustment of learning rate [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS, 1995, 6(01):157-169.
[8] Cover T. M., 2006, ELEMENTS INFORM THEO, DOI 10.1002/047174882X.
[9] Demuth H., 2002, NEURAL NETWORK TOOLB.
[10] DeSieno D, 1988, P IEEE INT C NEURAL, V1, P117.