Learning by Asymmetric Parallel Boltzmann Machines

Cited by: 5
Authors
Apolloni, Bruno [1 ]
de Falco, Diego [2 ]
Affiliations
[1] Univ Milan, Dipartimento Sci Informaz, I-20133 Milan, Italy
[2] Politecn Milan, Dipartimento Matemat, I-20133 Milan, Italy
Keywords
DOI
10.1162/neco.1991.3.3.402
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We consider the Little, Shaw, Vasudevan model as a parallel asymmetric Boltzmann machine, in the sense that we extend to this model the entropic learning rule first studied by Ackley, Hinton, and Sejnowski in the case of a sequentially activated network with a symmetric synaptic matrix. The resulting Hebbian learning rule for the parallel asymmetric model draws the signal for updating the synaptic weights from time averages of the discrepancy between expected and actual transitions along the past history of the network. Since we do not assume symmetry of the weights, our analysis also covers feedforward networks, for which the entropic learning rule turns out to be complementary to the error backpropagation rule, in that it "rewards the correct behavior" instead of "penalizing the wrong answers."
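To make the abstract's setup concrete, here is a minimal sketch of a synchronous ("parallel", Little-model) network with an asymmetric weight matrix, together with a Hebbian-style update driven by the discrepancy between expected and actual transitions averaged over the past history. This is an illustration only: the function names, the learning rate, and the simplified learning signal `(s(t+1) - p(t)) * s(t)` are assumptions for the sketch, not the paper's exact entropic rule, which compares statistics under environmental clamping and free running.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def parallel_step(W, s, rng):
    """Synchronous (Little-model) update: all units draw their new
    states simultaneously; unit i fires with probability
    sigmoid of its net input. W need not be symmetric."""
    p = sigmoid(W @ s)                       # expected firing probabilities
    s_new = (rng.random(len(s)) < p).astype(float)
    return s_new, p

def entropic_update(W, history, lr=0.1):
    """Hedged sketch of a Hebbian rule in the spirit of the abstract:
    accumulate over the past history the discrepancy between the
    actual transition s(t+1) and the expected one p(t), and nudge
    each (asymmetric) weight w_ij by the time average of
    (s_i(t+1) - p_i(t)) * s_j(t)."""
    dW = np.zeros_like(W)
    for s, s_next, p in history:
        dW += np.outer(s_next - p, s)
    return W + lr * dW / len(history)

# Toy run: 4 units, asymmetric weights (W != W.T in general).
n = 4
W = rng.normal(scale=0.5, size=(n, n))
s = (rng.random(n) < 0.5).astype(float)
history = []
for _ in range(50):
    s_next, p = parallel_step(W, s, rng)
    history.append((s, s_next, p))
    s = s_next
W = entropic_update(W, history)
```

Because every unit updates at once, the model is a Markov chain on binary state vectors rather than a sequentially relaxed Hopfield-style network, which is what allows the weight matrix to stay asymmetric.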
Pages: 402-408
Page count: 7
Related Papers
10 records in total
[1] Ackley, D. H. (1985). Cognitive Science, 9, 147.
[2] Apolloni, B. (1991). Proceedings of Neuronet (in press), 90.
[3] Bertoni, A. (1989). Proceedings of Neuro-Nimes '89, 361.
[4] Bruck, J., Goodman, J. W. (1988). A generalized convergence theorem for neural networks. IEEE Transactions on Information Theory, 34(5), 1089-1092.
[5] Hinton, G. E. (1989). Connectionist learning procedures. Artificial Intelligence, 40(1-3), 185-234.
[6] Little, W. A. (1974). Mathematical Biosciences, 19, 101. DOI 10.1016/0025-5564(74)90031-5
[7] Little, W. A., Shaw, G. L. (1978). Analytic study of memory storage capacity of a neural network. Mathematical Biosciences, 39(3-4), 281-290.
[8] Pisano, R. (1991). Thesis.
[9] Rumelhart, D. E. (1987). Learning internal representations by error propagation, p. 318.
[10] Shaw, G. L. (1974). Mathematical Biosciences, 21, 207. DOI 10.1016/0025-5564(74)90015-7