A recurrent log-linearized Gaussian mixture network

Cited by: 34
Authors
Tsuji, T [1]
Bu, N
Fukuda, O
Kaneko, M
Affiliations
[1] Hiroshima Univ, Dept Artificial Complex Syst Engn, Higashihiroshima 7398527, Japan
[2] Natl Inst Adv Ind Sci & Technol, Tsukuba, Ibaraki 3058564, Japan
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2003 / Vol. 14 / No. 2
Keywords
EEG; Gaussian mixture model; hidden Markov model (HMM); log-linearized model; neural networks (NNs); pattern classification; recurrent neural networks (RNNs)
DOI
10.1109/TNN.2003.809403
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Context in time series is one of the most useful and interesting characteristics for machine learning. In some cases, dynamic characteristics are the only basis on which classification can be achieved. This paper proposes a novel neural network, named the recurrent log-linearized Gaussian mixture network (R-LLGMN), for the classification of time series. The structure of this network is based on a hidden Markov model (HMM), which has been well developed in the area of speech recognition. R-LLGMN can also be interpreted as an extension of a probabilistic neural network that uses a log-linearized Gaussian mixture model, into which recurrent connections have been incorporated to exploit temporal information. Simulation experiments are carried out to compare R-LLGMN with a traditional HMM-based classifier, and pattern classification experiments on EEG signals are then conducted. These experiments indicate that R-LLGMN can successfully classify not only artificial data but also real biological data such as EEG signals.
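The abstract describes a network whose recurrent layer realizes an HMM-style forward recursion with Gaussian-mixture emissions, classifying a sequence by which class model assigns it the highest probability. As a rough illustration of that underlying computation (not the paper's actual R-LLGMN architecture or parameters, which are learned log-linearized weights), a minimal forward-algorithm sketch with made-up toy models might look like this:

```python
import math

# Illustrative sketch of the HMM forward recursion that underlies an
# R-LLGMN-style classifier: each class has its own HMM with Gaussian-mixture
# state emissions; the class whose model yields the highest sequence
# likelihood is chosen. All parameters here are hypothetical toy values.

def gaussian_pdf(x, mean, var):
    """Univariate Gaussian density."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def mixture_density(x, components):
    """Gaussian mixture density; components is a list of (weight, mean, var)."""
    return sum(w * gaussian_pdf(x, m, v) for w, m, v in components)

def forward_log_likelihood(seq, prior, trans, emit):
    """Standard HMM forward algorithm.

    prior[i]   : initial probability of state i
    trans[i][j]: transition probability from state i to state j
    emit[j]    : Gaussian mixture (list of components) for state j
    """
    alpha = [prior[i] * mixture_density(seq[0], emit[i]) for i in range(len(prior))]
    for x in seq[1:]:
        alpha = [mixture_density(x, emit[j]) *
                 sum(alpha[i] * trans[i][j] for i in range(len(alpha)))
                 for j in range(len(alpha))]
    return math.log(sum(alpha))

def classify(seq, models):
    """models maps class label -> (prior, trans, emit); pick the max-likelihood class."""
    return max(models, key=lambda c: forward_log_likelihood(seq, *models[c]))
```

For example, with two single-state toy models whose emission means are 0.0 and 2.0, `classify([0.1, 0.0, -0.1], models)` selects the low-mean class. In R-LLGMN itself, this recursion is reparameterized log-linearly so that all HMM quantities become network weights trainable by backpropagation through time.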
Pages: 304-316
Page count: 13