Efficient supervised learning in networks with binary synapses

Cited by: 69
Authors
Baldassi, Carlo
Braunstein, Alfredo
Brunel, Nicolas
Zecchina, Riccardo
Affiliations
[1] Inst Sci Interchange Fdn, I-10133 Turin, Italy
[2] Politecn Torino, I-10129 Turin, Italy
[3] Univ Paris 05, Lab Neurophys & Physiol, UMR 8119, CNRS, F-75270 Paris 06, France
[4] Abdus Salam Int Ctr Theoret Phys, I-34100 Trieste, Italy
Keywords
belief propagation; computational neuroscience; perceptron; synaptic plasticity
DOI
10.1073/pnas.0700324104
Chinese Library Classification
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Subject Classification Code
07; 0710; 09
Abstract
Recent experimental studies indicate that synaptic changes induced by neuronal activity are discrete jumps between a small number of stable states. Learning in systems with discrete synapses is known to be a computationally hard problem. Here, we study a neurobiologically plausible on-line learning algorithm that derives from belief propagation algorithms. We show that it performs remarkably well in a model neuron with binary synapses, and a finite number of "hidden" states per synapse, that has to learn a random classification task. Such a system is able to learn a number of associations close to the theoretical limit in a time that is sublinear in system size. To our knowledge, this is the first on-line algorithm able to efficiently achieve a finite number of learned patterns per binary synapse. Furthermore, we show that performance is optimal for a finite number of hidden states that becomes very small for sparse coding. The algorithm is similar to the standard "perceptron" learning algorithm, with an additional rule for synaptic transitions that occur only if a currently presented pattern is "barely correct." In this case, the synaptic changes are metaplastic only (a change in hidden states but not in the actual synaptic state), stabilizing the synapse in its current state. Finally, we show that a system with two visible states and K hidden states is much more robust to noise than a system with K visible states. We suggest that this rule is sufficiently simple to be easily implemented by neurobiological systems or in hardware.
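The learning rule sketched in the abstract (perceptron-like updates on errors, plus a metaplastic "barely correct" update that moves only hidden states) can be illustrated roughly as follows. This is a minimal NumPy sketch under assumed conventions: ±1 input patterns, an integer hidden state per synapse mapped to a binary visible weight, and illustrative thresholds and probabilities (theta, p_meta); it is not the published algorithm's exact update schedule or constants.

import numpy as np

def train_binary_synapses(patterns, labels, K=4, theta=1.0, p_meta=0.5,
                          epochs=200, seed=0):
    # Each synapse has an integer hidden state h_i in [-K, K]; the visible
    # binary weight is w_i = +1 if h_i >= 0 else -1 (mapping is an assumption).
    rng = np.random.default_rng(seed)
    n = patterns.shape[1]
    h = rng.choice([-1, 0], size=n)              # start near the switching boundary
    for _ in range(epochs):
        n_errors = 0
        for x, y in zip(patterns, labels):       # x in {-1,+1}^n, y in {-1,+1}
            w = np.where(h >= 0, 1, -1)          # visible binary synapses
            margin = y * np.dot(w, x)
            if margin <= 0:
                # Misclassified: plastic update, hidden states move toward the
                # desired output and may flip the visible synapse.
                h = np.clip(h + y * x, -K, K)
                n_errors += 1
            elif margin <= theta and rng.random() < p_meta:
                # "Barely correct": metaplastic update only, push hidden states
                # away from the boundary; the visible synapse never changes here.
                h = np.clip(h + np.where(h >= 0, 1, -1), -K, K)
        if n_errors == 0:
            break
    return np.where(h >= 0, 1, -1)

The design point the abstract emphasizes is visible in the second branch: stabilizing barely-correct synapses only in their hidden states is what distinguishes this rule from a plain perceptron with clipped integer weights.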
Pages: 11079-11084
Number of pages: 6