Mixing floating- and fixed-point formats for neural network learning on neuroprocessors

Cited by: 9
Authors
Anguita, D [1 ]
Gomes, BA [1 ]
Affiliation
[1] International Computer Science Institute, Berkeley, CA 94704
Source
MICROPROCESSING AND MICROPROGRAMMING | 1996 / Vol. 41 / No. 10
Keywords
neural networks; neuroprocessors; fixed-point format;
DOI
10.1016/0165-6074(96)00012-9
Chinese Library Classification
TP3 [Computing technology, computer technology]
Subject Classification Code
0812
Abstract
We examine the efficient implementation of back-propagation (BP) type algorithms on TO [3], a vector processor with a fixed-point engine designed for neural network simulation. Using Matrix Back Propagation (MBP) [2], we achieve asymptotically optimal performance on TO (about 0.8 GOPS) for both the forward and backward phases, which is not possible with the standard on-line BP algorithm. We use a mixture of fixed- and floating-point operations in order to guarantee both high efficiency and fast convergence. Although the most expensive computations are implemented in fixed point, we achieve a rate of convergence comparable to that of the floating-point version. The time taken for conversion between fixed- and floating-point formats is also shown to be reasonably low.
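The mixed-format scheme described in the abstract (fixed-point arithmetic for the expensive matrix products, floating point elsewhere, with conversions in between) can be illustrated with a minimal Python sketch. This is not the authors' TO/MBP implementation: the Q3.12 format, the saturation bounds, and the names to_fixed, to_float and matmul_fixed are assumptions made for illustration only.

import numpy as np

# Assumed Q3.12 16-bit fixed-point format (hypothetical; the paper does
# not prescribe a particular format in the abstract).
FRAC_BITS = 12
SCALE = 1 << FRAC_BITS

def to_fixed(x):
    # Quantize floats to 16-bit fixed point, saturating at the int16 range.
    return np.clip(np.round(x * SCALE), -32768, 32767).astype(np.int16)

def to_float(x_fx):
    # Convert fixed-point values back to floating point.
    return x_fx.astype(np.float32) / SCALE

def matmul_fixed(a_fx, b_fx):
    # The expensive matrix product runs entirely in integer arithmetic,
    # accumulating in 32 bits as a fixed-point engine would. The product
    # of two Q3.12 values is Q6.24, so shift right to return to Q3.12.
    acc = a_fx.astype(np.int32) @ b_fx.astype(np.int32)
    return np.clip(acc >> FRAC_BITS, -32768, 32767).astype(np.int16)

rng = np.random.default_rng(0)
W = rng.uniform(-1.0, 1.0, (4, 8)).astype(np.float32)   # weights, kept in float
X = rng.uniform(-1.0, 1.0, (8, 5)).astype(np.float32)   # input activations

# Convert, multiply in fixed point, convert back: the round trip whose
# cost the paper reports as reasonably low.
Y_fixed = to_float(matmul_fixed(to_fixed(W), to_fixed(X)))
Y_float = W @ X
print(np.max(np.abs(Y_fixed - Y_float)))   # quantization error, roughly 1e-3 here

In a training loop, only the operands of the large matrix products would be converted in this way, while the weight updates would stay in floating point to preserve convergence, consistent with the mixture the abstract describes.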
Pages: 757-769
Number of pages: 13
References
28 in total
  • [1] Alippi C, 1994, Int. Joint Conf. on Neural Networks, p. 1873
  • [2] Anguita D, Parodi G, Zunino R. An efficient implementation of BP on RISC-based workstations. Neurocomputing, 1994, 6(1): 57-65
  • [3] Asanovic K, 1993, Int. J. Neural Systems, 4: 317, DOI 10.1142/S0129065793000250
  • [4] Asanovic K, 1991, Proc. 2nd Int. Conf. on Microelectronics for Neural Networks, p. 9
  • [5] Asanovic K, 1995, Hot Chips 7 Symposium, Stanford
  • [6] Bochev V, 1993, IEEE Trans. Signal Processing, Vol. 41
  • [7] Bourlard H, Morgan N. Continuous speech recognition by connectionist statistical methods. IEEE Trans. Neural Networks, 1993, 4(6): 893-909
  • [8] Carrato S, 1991, Proc. Int. Conf. on Digital Signal Processing, p. 526
  • [9] Caviglia DD, 1990, Proc. IJCNN-90 San Diego, p. 631
  • [10] Corana A, 1989, High Performance Computing, p. 181