LEARNING TIME-SERIES EVOLUTION BY UNSUPERVISED EXTRACTION OF CORRELATIONS

Cited by: 11
Authors
DECO, G
SCHURMANN, B
Affiliation
[1] Siemens AG, Corporate Research and Development, ZFE ST SN 41, 81739 Munich
Source
PHYSICAL REVIEW E | 1995, Vol. 51, Issue 03
DOI
10.1103/PhysRevE.51.1780
Chinese Library Classification
O35 [Fluid Mechanics]; O53 [Plasma Physics];
Subject Classification Codes
070204 ; 080103 ; 080704 ;
Abstract
We focus on the problem of modeling time series by learning statistical correlations between the past and present elements of the series in an unsupervised fashion. This kind of correlation is, in general, nonlinear, especially in the chaotic domain. Therefore the learning algorithm should be able to extract statistical correlations, i.e., higher-order correlations between the elements of the time signal. This problem can be viewed as a special case of factorial learning. Factorial learning may be formulated as an unsupervised redundancy reduction between the output components of a transformation that conserves the transmitted information. An information-theoretic architecture and learning paradigm are introduced. The neural architecture has only one layer and a triangular structure, so that each element is transformed by observing only the past and the volume is conserved. In this fashion, a transformation that guarantees transmission of information without loss is formulated. The learning rule decorrelates the output components of the network. Two methods are used: higher-order decorrelation by explicit evaluation of higher-order cumulants of the output distributions, and minimization of the sum of entropies of each output component, which minimizes the mutual information between them because, by Gibbs' second theorem, this sum is an upper bound on the joint entropy. After decorrelation of the output components, the correlations between the elements of the time series can be extracted by analyzing the trained neural architecture. As a consequence, we are able to model chaotic and nonchaotic time series. Furthermore, a critical point in modeling time series is the determination of the dimension of the embedding vector used, i.e., the number of past components needed to predict the future. With this method we can detect the embedding dimension by extracting the influence of the past on the future, i.e., the correlation between the remote past and the future. Optimal embedding dimensions are obtained for the Hénon map and the Mackey-Glass series. Even when the data are corrupted by colored noise, a model is still possible; the noise is then decorrelated by the network. In the case of modeling a chemical reaction, the most natural volume-conserving architecture is a symplectic network, which describes a system that conserves the entropy and therefore the transmitted information. © 1995 The American Physical Society.
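The entropy-minimization rule summarized in the abstract admits a short justification. In our notation (reconstructed from the abstract, not taken from the paper itself), the mutual information between the output components decomposes as

```latex
I(y_1;\dots;y_n) \;=\; \sum_{i=1}^{n} H(y_i) \;-\; H(\mathbf{y}),
\qquad
H(\mathbf{y}) \;\le\; \sum_{i=1}^{n} H(y_i) \quad \text{(Gibbs' second theorem)},
```

and for a volume-conserving map \(\mathbf{y} = F(\mathbf{x})\) with \(|\det J_F| = 1\) the joint entropy \(H(\mathbf{y}) = H(\mathbf{x})\) is fixed by the data, so minimizing \(\sum_i H(y_i)\) minimizes \(I(y_1;\dots;y_n)\) while the transmitted information is conserved.

A minimal sketch of such a triangular, volume-conserving transformation in Python follows; all names and the specific form of the nonlinearity are our assumptions for illustration, not the paper's exact network:

```python
import numpy as np

def triangular_transform(x, f):
    """Volume-conserving triangular map: y[i] = x[i] + f(x[:i]).

    Each output observes only strictly earlier inputs, so the Jacobian is
    lower-triangular with unit diagonal and |det J| = 1: the map conserves
    volume, and hence the joint entropy of the transmitted signal.
    """
    y = np.empty_like(x)
    for i in range(len(x)):
        y[i] = x[i] + f(x[:i])  # f maps the "past" to a scalar correction
    return y

# Hypothetical nonlinearity: a fixed single-layer readout of the past.
rng = np.random.default_rng(0)
w = rng.normal(size=16)

def f(past):
    return np.tanh(past @ w[:past.size]) if past.size else 0.0

x = rng.normal(size=16)          # one window of the time series
y = triangular_transform(x, f)

# Sequential inversion confirms that no information is lost.
x_rec = np.empty_like(y)
for i in range(len(y)):
    x_rec[i] = y[i] - f(x_rec[:i])
assert np.allclose(x, x_rec)
```

Because the Jacobian of such a map is unit-triangular, its determinant is 1 regardless of f, so invertibility and entropy conservation hold by construction; training would then adjust the parameters of f to decorrelate the outputs by one of the two rules named in the abstract (zeroing higher-order cross-cumulants, or minimizing the sum of output entropies).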
Pages: 1780-1790
Page count: 11