Convergence acceleration of the Hopfield neural network by optimizing integration step sizes

Cited by: 15
Authors
Abe, S
Affiliation
[1] Hitachi Research Laboratory, Hitachi, Ltd.
Source
IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART B-CYBERNETICS | 1996 / Vol. 26 / No. 1
Keywords
DOI
10.1109/3477.484454
CLC number
TP [Automation technology, computer technology];
Subject classification code
0812;
Abstract
In our previous work we clarified the global convergence of the Hopfield neural network and showed, by computer simulations, that solution quality improves when the diagonal elements of the coefficient matrix are gradually decreased. In this paper, to accelerate convergence of the Hopfield network, the integration step size is determined dynamically at each time step so that at least one component of the variable vector reaches the surface of the hypercube. Computer simulations for the traveling salesman problem and an LSI module placement problem show that convergence is stabilized and accelerated compared with integration using a constant step size.
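The abstract describes the step-size rule only in words; the sketch below illustrates the idea under stated assumptions. It assumes continuous-valued dynamics dv/dt = Wv + b restricted to the unit hypercube [0, 1]^n and Euler integration; the function name, the convergence test, and the small example problem are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def hopfield_adaptive_step(W, b, v0, max_steps=1000, tol=1e-12):
    """Euler integration of dv/dt = W v + b on the unit hypercube [0, 1]^n.
    At every step the step size is chosen so that at least one interior
    component of v just reaches a face of the hypercube (a sketch of the
    step-size rule described in the abstract)."""
    v = np.clip(np.asarray(v0, dtype=float), 0.0, 1.0)
    for _ in range(max_steps):
        d = W @ v + b  # time derivative of the state vector
        # Time each component needs to reach the face it is moving toward.
        with np.errstate(divide="ignore", invalid="ignore"):
            t_hit = np.where(d > tol, (1.0 - v) / d,
                             np.where(d < -tol, -v / d, np.inf))
        # Components already resting on a face and pushing outward cannot move.
        t_hit[(v >= 1.0) & (d > 0.0)] = np.inf
        t_hit[(v <= 0.0) & (d < 0.0)] = np.inf
        dt = t_hit.min()
        if not np.isfinite(dt):
            break  # every component is pinned on a face; the state has settled
        # The component with the smallest hitting time lands exactly on a face.
        v = np.clip(v + dt * d, 0.0, 1.0)
    return v

# Tiny illustrative run: a 3-variable winner-take-all-style coefficient matrix.
W = -np.array([[0.0, 1.0, 1.0],
               [1.0, 0.0, 1.0],
               [1.0, 1.0, 0.0]])
b = np.full(3, 0.5)
print(hopfield_adaptive_step(W, b, v0=np.array([0.4, 0.5, 0.6])))  # -> vertex [0, 0, 1]
```

Taking the minimum hitting time keeps every component inside the hypercube while the nearest one reaches a face exactly, so the step can be far larger than a conservative constant step early in the integration, which is the source of the reported acceleration.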
Pages: 194-201
Page count: 8