AN ANALYSIS OF THE GLVQ ALGORITHM

Cited by: 19
Authors
GONZALEZ, AI
GRANA, M
DANJOU, A
Affiliation
[1] Dpto. CCIA, Universidad Pais Vasco/EHU Apartado 649
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 1995, Vol. 6, No. 4
DOI
10.1109/72.392266
CLC Number
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Generalized learning vector quantization (GLVQ) has been proposed in [5] as a generalization of the simple competitive learning (SCL) algorithm. The main argument for the GLVQ proposal is its superior insensitivity to the initial values of the weights (codevectors). In this letter we show that the distinctive characteristics of the GLVQ definition disappear outside a small domain of applications. GLVQ becomes identical to SCL when either the number of codevectors grows or the size of the input space is large. Moreover, the behavior of GLVQ is inconsistent for problems defined on very small-scale input spaces: the adaptation rules fluctuate between performing descent and ascent searches on the gradient of the distortion function.
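To make the comparison concrete, the following minimal Python sketch (not taken from the paper) contrasts the SCL winner-take-all update with a GLVQ-style update of the form that follows from the loss function proposed in [5]; the function and variable names (scl_update, glvq_update, V, alpha, d2, D) are illustrative assumptions, and the exact rules should be taken from [5]. With D denoting the total squared distance from the input to all codevectors, the winner's scaling factor tends to 1 and the non-winners' factor tends to 0 as D grows, which is why GLVQ collapses to SCL for many codevectors or a large input space; for D < 1 the factors can change sign, producing the descent/ascent fluctuation described above.

```python
import numpy as np

def scl_update(x, V, alpha):
    """Simple competitive learning: move only the winning codevector toward x."""
    d2 = np.sum((V - x) ** 2, axis=1)   # squared distances to all codevectors
    i = int(np.argmin(d2))              # winner index
    V[i] += alpha * (x - V[i])
    return V

def glvq_update(x, V, alpha):
    """GLVQ-style update (a sketch reconstructed from the loss in [5])."""
    d2 = np.sum((V - x) ** 2, axis=1)   # squared distances to all codevectors
    i = int(np.argmin(d2))              # winner index
    D = float(np.sum(d2))               # total distortion for this input
    # Winner factor (D^2 - D + d_i^2) / D^2 -> 1 as D grows (SCL limit)
    V[i] += alpha * (x - V[i]) * (D ** 2 - D + d2[i]) / D ** 2
    # Non-winner factor d_i^2 / D^2 -> 0 as D grows (no update in the SCL limit)
    for j in range(len(V)):
        if j != i:
            V[j] += alpha * (x - V[j]) * d2[i] / D ** 2
    return V

# Example: 4 codevectors in 2-D, one input sample
rng = np.random.default_rng(0)
V = rng.random((4, 2))
x = rng.random(2)
V = glvq_update(x, V, alpha=0.05)
```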
Pages: 1012-1016
Number of pages: 5
References
7 items in total
  • [1] [Anonymous], 1991, INTRO THEORY NEURAL, DOI 10.1201/9780429499661
  • [2] [Anonymous], 1992, NEURAL NETWORKS FUZZ
  • [3] KOHONEN T, 1992, THEORY APPLICATIONS
  • [4] KOHONEN T, 1989, SELF ORG ASSOCIATIVE, 3rd ed.
  • [5] PAL NR, BEZDEK JC, TSAO ECK, 1993, GENERALIZED CLUSTERING NETWORKS AND KOHONEN SELF-ORGANIZING SCHEME, IEEE TRANSACTIONS ON NEURAL NETWORKS, V4, P549-557
  • [6] TOMASINI L, 1993, THESIS ECOLE NATIONA
  • [7] YAIR E, IEEE T SIGNAL PROCES, V40, P294