Including Hints in Training Neural Nets

Cited by: 14
Authors
Al-Mashouq, Khalid A. [1 ]
Reed, Irving S. [1 ]
Affiliations
[1] Univ Southern Calif, Dept Elect Engn, Los Angeles, CA 90007 USA
DOI: 10.1162/neco.1991.3.3.418
Chinese Library Classification
TP18 [Artificial intelligence theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
The aim of a neural net is to partition the data space into near-optimal decision regions. Learning such a partitioning solely from examples has proven to be a very hard problem (Blum and Rivest 1988; Judd 1988). To remedy this, we use the idea of supplying hints to the network, as discussed by Abu-Mostafa (1990). Hints reduce the solution space and, as a consequence, speed up the learning process. The minimum Hamming distance between the patterns serves as the hint. It is shown how to learn such a hint and how to incorporate it into the learning algorithm. Modifications to the net structure and its operation are suggested that allow for better generalization. The sensitivity to errors in the hint is studied through simulations.
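The hint described in the abstract is the minimum Hamming distance among the training patterns. As a minimal sketch of that quantity only (not code from the paper; the function name, array layout, and example patterns are assumptions made here for illustration), the following Python snippet computes the minimum pairwise Hamming distance over a set of binary patterns:

import numpy as np

def min_hamming_distance(patterns):
    # patterns: array of shape (N, d) with 0/1 entries, one pattern per row.
    # Returns the smallest number of positions in which any two patterns differ,
    # i.e., the quantity the abstract proposes to supply as a hint.
    patterns = np.asarray(patterns)
    n, d = patterns.shape
    best = d  # a Hamming distance can never exceed the pattern length
    for i in range(n):
        for j in range(i + 1, n):
            dist = int(np.count_nonzero(patterns[i] != patterns[j]))
            best = min(best, dist)
    return best

# Example with three 8-bit patterns; the closest pair differs in 2 positions.
X = np.array([
    [0, 0, 0, 0, 1, 1, 1, 1],
    [0, 0, 1, 1, 1, 1, 0, 0],
    [1, 1, 0, 0, 1, 1, 0, 0],
])
print(min_hamming_distance(X))  # prints 2

This shows only the raw computation of the hint; how the paper learns an estimate of this distance and feeds it into the training algorithm is described in the article itself.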
Pages: 418-427
Page count: 10
References (5 items)
[1] Al-Mashouq, K. (1990). 7th International Conference on Systems Engineering, Las Vegas.
[2] Blum, A., and Rivest, R. L. (1988). Proceedings of the 1st Annual Workshop on Computational Learning Theory, p. 9.
[3] Judd, S. (1988). Journal of Complexity, 4, 177. DOI: 10.1016/0885-064X(88)90019-2.
[4] McEliece, R. J., Posner, E. C., Rodemich, E. R., and Venkatesh, S. S. (1987). The capacity of the Hopfield associative memory. IEEE Transactions on Information Theory, 33(4), 461-482.
[5] Rumelhart, D. E. (1986). ENCY DATABASE SYST, p. 45.