Can threshold networks be trained directly?

Cited by: 191
Authors
Huang, GB [1 ]
Zhu, QY [1 ]
Mao, KZ [1 ]
Siew, CK [1 ]
Saratchandran, P [1 ]
Sundararajan, N [1 ]
Affiliation
[1] Nanyang Technol Univ, Sch Elect & Elect Engn, Singapore 639798, Singapore
Keywords
extreme learning machine (ELM); gradient descent method; threshold neural networks;
DOI
10.1109/TCSII.2005.857540
CLC Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline Codes
0808; 0809;
Abstract
Neural networks with threshold activation functions are highly desirable because of the ease of hardware implementation. However, popular gradient-based learning algorithms cannot be used to train these networks directly, as the threshold functions are nondifferentiable. Methods available in the literature mainly focus on approximating the threshold activation functions with sigmoid functions. In this paper, we show theoretically that the recently developed extreme learning machine (ELM) algorithm can be used to train neural networks with threshold functions directly, instead of approximating them with sigmoid functions. Experimental results on real-world benchmark regression problems demonstrate that the generalization performance obtained by ELM is better than that of other algorithms used for threshold networks. Moreover, the ELM method needs no control variables (manually tuned parameters) and is much faster.
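Since the abstract only states that ELM trains threshold networks directly without approximating the step function, a minimal sketch may help: the hidden-layer weights and biases are assigned at random and kept fixed, the hard-threshold hidden outputs are computed once, and the output weights are obtained from a Moore-Penrose pseudoinverse. This is a NumPy illustration of the general ELM setting only; the function names and the toy sinc regression data are illustrative assumptions, not taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

def elm_fit(X, y, n_hidden=100):
    # Random input weights and biases are fixed; only output weights are learned.
    W = rng.uniform(-1.0, 1.0, size=(X.shape[1], n_hidden))
    b = rng.uniform(-1.0, 1.0, size=n_hidden)
    H = (X @ W + b > 0).astype(float)        # hard-threshold (step) activations
    beta = np.linalg.pinv(H) @ y              # least-squares output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = (X @ W + b > 0).astype(float)
    return H @ beta

# Toy regression: approximate sin(x)/x on [-10, 10] (hypothetical benchmark).
X = rng.uniform(-10, 10, size=(500, 1))
y = np.sinc(X[:, 0] / np.pi)                  # np.sinc(x/pi) = sin(x)/x
W, b, beta = elm_fit(X, y)
rmse = np.sqrt(np.mean((elm_predict(X, W, b, beta) - y) ** 2))
print("train RMSE:", rmse)

No gradient is ever taken through the step function, which is why nondifferentiability poses no obstacle in this scheme.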
Pages: 187-191 (5 pages)