THE GEOMETRICAL LEARNING OF BINARY NEURAL NETWORKS

Cited by: 68
Authors
KIM, JH [1]
PARK, SK [1]
Affiliation
[1] TENNESSEE TECHNOL UNIV,DEPT ELECT ENGN,COOKEVILLE,TN 38505
Source
IEEE Transactions on Neural Networks | 1995 / Vol. 6 / No. 1
Funding
U.S. National Science Foundation;
Keywords
DOI
10.1109/72.363432
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper, a learning algorithm called expand-and-truncate learning (ETL) is proposed to train multilayer binary neural networks (BNN's) with guaranteed convergence for any binary-to-binary mapping. The most significant contribution of this paper is the development of a learning algorithm for three-layer BNN's that guarantees convergence while automatically determining the required number of neurons in the hidden layer. Furthermore, the proposed ETL algorithm learns much faster than the back-propagation learning algorithm in a binary field. Neurons in the proposed BNN employ a hard-limiter activation function, with only integer weights and integer thresholds, which greatly facilitates hardware implementation of the proposed BNN using currently available digital VLSI technology.
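The neuron model described in the abstract can be sketched as follows. This is a minimal illustration of a hard-limiter neuron with integer weights and an integer threshold, wired by hand into a small two-layer BNN computing XOR; it does not implement the ETL training algorithm itself, and the function names and the particular weights are illustrative assumptions, not taken from the paper.

```python
def binary_neuron(inputs, weights, threshold):
    """Hard-limiter neuron: output 1 if the integer weighted sum
    reaches the integer threshold, else 0."""
    s = sum(x * w for x, w in zip(inputs, weights))
    return 1 if s >= threshold else 0

def xor_bnn(x1, x2):
    """Hand-wired two-layer BNN for XOR (hypothetical example, not ETL output)."""
    h1 = binary_neuron([x1, x2], [1, 1], 1)    # hidden neuron: OR
    h2 = binary_neuron([x1, x2], [1, 1], 2)    # hidden neuron: AND
    return binary_neuron([h1, h2], [1, -1], 1) # output: h1 AND NOT h2

# XOR truth table reproduced with only integer parameters and hard limiters:
print([xor_bnn(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # → [0, 1, 1, 0]
```

Because every weight and threshold is an integer and the activation is a simple comparison, each neuron maps directly onto digital comparator logic, which is the hardware-implementation point the abstract makes.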
Pages: 237-247 (11 pages)