Temperature-constrained cascade correlation networks

Cited by: 20
Authors
Harrington, PD [1]
Affiliation
[1] Ohio Univ, Dept Chem, Clippinger Labs, Ctr Intelligent Chem Instrumentat, Athens, OH 45701 USA
DOI
10.1021/ac970851y
Chinese Library Classification
O65 [Analytical Chemistry]
Discipline codes
070302; 081704
Abstract
A novel neural network has been devised that combines the advantages of cascade correlation and computational temperature constraints. The combination of advantages yields a nonlinear calibration method that is easier to use, more stable, and faster to train than backpropagation networks. Cascade correlation networks adjust only a single unit at a time, so they train very rapidly compared to backpropagation networks. Cascade correlation networks determine their topology during training. In addition, the hidden units are not readjusted once they have been trained, so these networks are capable of incremental learning and caching. With the cascade architecture, temperature may be optimized for each hidden unit. Computational temperature is a parameter that controls the fuzziness of a hidden unit's output. The magnitude of the change in covariance with respect to temperature is maximized. This criterion avoids local minima, forces the hidden units to model larger variances in the data, and generates hidden units that furnish fuzzy logic. As a result, models built using temperature-constrained cascade correlation networks are better at interpolation, or generalization of the design points. These properties are demonstrated for exemplary linear interpolations, a nonlinear interpolation, and chemical data sets for which the numbers of chlorine atoms in polychlorinated biphenyl molecules are predicted from mass spectra.
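The temperature mechanism summarized in the abstract can be illustrated with a minimal sketch (not the paper's implementation; the function names, the grid of candidate temperatures, and the finite-difference estimate of the covariance derivative are illustrative assumptions):

```python
import numpy as np

def tsigmoid(x, temperature):
    # Sigmoid hidden-unit transfer function with a computational temperature:
    # larger temperatures flatten the curve, giving fuzzier (less crisp) outputs.
    return 1.0 / (1.0 + np.exp(-np.asarray(x, dtype=float) / temperature))

def covariance_magnitude(net_input, residuals, temperature):
    # Cascade-correlation candidate criterion (sketch): magnitude of the
    # covariance between a candidate unit's output and the output residuals.
    out = tsigmoid(net_input, temperature)
    return abs(np.mean((out - out.mean()) * (residuals - residuals.mean())))

def select_temperature(net_input, residuals, temps):
    # Temperature-constrained selection (sketch): choose the temperature where
    # the covariance changes fastest, i.e. where |dCov/dT| is maximal,
    # estimated here by finite differences over a grid of candidate values.
    cov = np.array([covariance_magnitude(net_input, residuals, t) for t in temps])
    dcov = np.abs(np.gradient(cov, temps))
    return temps[int(np.argmax(dcov))]
```

In a full cascade correlation network, each new hidden unit would be trained against the current residuals and then frozen, so the optimization above would be repeated once per added unit.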
Pages: 1297-1306
Page count: 10
References (25 total)
[1]   Artificial neural network processing of stripping analysis responses for identifying and quantifying heavy metals in the presence of intermetallic compound formation [J].
Chan, H ;
Butler, A ;
Falck, DM ;
Freund, MS .
ANALYTICAL CHEMISTRY, 1997, 69 (13) :2373-2378
[2]   COMPUTER METHODS IN ANALYTICAL MASS SPECTROMETRY EMPIRICAL IDENTIFICATION OF MOLECULAR CLASS [J].
CRAWFORD, LR ;
MORRISON, JD .
ANALYTICAL CHEMISTRY, 1968, 40 (10) :1469-&
[3]  
Fahlman S. E., 1991, CMU-CS-90-100, P1
[4]  
FAHLMAN SE, 1988, EMPIRICAL STUDY LEAR, P1
[5]   PARTIAL LEAST-SQUARES REGRESSION - A TUTORIAL [J].
GELADI, P ;
KOWALSKI, BR .
ANALYTICA CHIMICA ACTA, 1986, 185 :1-17
[6]   Correction of mass spectral drift using artificial neural networks [J].
Goodacre, R ;
Kell, DB .
ANALYTICAL CHEMISTRY, 1996, 68 (02) :271-280
[7]   FUZZY MULTIVARIATE RULE-BUILDING EXPERT SYSTEMS - MINIMAL NEURAL NETWORKS [J].
HARRINGTON, PB .
JOURNAL OF CHEMOMETRICS, 1991, 5 (05) :467-486
[8]   MINIMAL NEURAL NETWORKS - DIFFERENTIATION OF CLASSIFICATION ENTROPY [J].
HARRINGTON, PD .
CHEMOMETRICS AND INTELLIGENT LABORATORY SYSTEMS, 1993, 19 (02) :143-154
[9]   TEMPERATURE-CONSTRAINED BACKPROPAGATION NEURAL NETWORKS [J].
HARRINGTON, PD .
ANALYTICAL CHEMISTRY, 1994, 66 (06) :802-807
[10]  
HATVRIK S, 1996, J CHEM INF COMP SCI, V36, P992