Exploration of synergistic and redundant information sharing in static and dynamical Gaussian systems

Cited by: 140
Author
Barrett, Adam B. [1 ,2 ]
Affiliations
[1] Univ Sussex, Sackler Ctr Consciousness Sci, Dept Informat, Brighton BN1 9QJ, E Sussex, England
[2] Univ Milan, Dept Clin Sci, I-20157 Milan, Italy
Source
PHYSICAL REVIEW E | 2015, Vol. 91, Issue 05
Funding
Engineering and Physical Sciences Research Council (EPSRC), UK
Keywords
Information analysis; Information dissemination; Gaussian distribution; Continuous time systems
DOI
10.1103/PhysRevE.91.052802
Chinese Library Classification
O35 [Fluid mechanics]; O53 [Plasma physics]
Discipline Code
070204 [Plasma physics]; 070301 [Inorganic chemistry]
Abstract
To fully characterize the information that two source variables carry about a third target variable, one must decompose the total information into redundant, unique, and synergistic components, i.e., obtain a partial information decomposition (PID). However, Shannon's theory of information does not provide formulas to fully determine these quantities. Several recent studies have begun addressing this. Some possible definitions for PID quantities have been proposed and some analyses have been carried out on systems composed of discrete variables. Here we present an in-depth analysis of PIDs on Gaussian systems, both static and dynamical. We show that, for a broad class of Gaussian systems, previously proposed PID formulas imply that (i) redundancy reduces to the minimum information provided by either source variable and hence is independent of correlation between sources, and (ii) synergy is the extra information contributed by the weaker source when the stronger source is known and can either increase or decrease with correlation between sources. We find that Gaussian systems frequently exhibit net synergy, i.e., the information carried jointly by both sources is greater than the sum of information carried by each source individually. Drawing from several explicit examples, we discuss the implications of these findings for measures of information transfer and information-based measures of complexity, both generally and within a neuroscience setting. Importantly, by providing independent formulas for synergy and redundancy applicable to continuous time-series data, we provide an approach to characterizing and quantifying information sharing amongst complex system variables.
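The PID quantities described in the abstract — redundancy as the minimum information provided by either source, synergy as the extra information contributed by the weaker source once the stronger source is known — can be sketched for jointly Gaussian variables, where each mutual information follows from log-determinants of covariance sub-blocks. The Python sketch below assumes a zero-mean joint covariance matrix; the function names and the example covariance (target = sum of two correlated sources plus noise) are illustrative choices, not taken from the paper.

```python
import numpy as np

def gaussian_mi(cov, ix, iy):
    """I(X;Y) in nats for jointly Gaussian variables with joint covariance
    `cov`, via I = 0.5 * log( det(S_xx) * det(S_yy) / det(S_joint) )."""
    sub = lambda idx: cov[np.ix_(idx, idx)]
    joint = list(ix) + list(iy)
    return 0.5 * np.log(np.linalg.det(sub(list(ix))) * np.linalg.det(sub(list(iy)))
                        / np.linalg.det(sub(joint)))

def mmi_pid(cov, src1, src2, tgt):
    """Minimum-mutual-information PID for a Gaussian system:
    redundancy = min(I1, I2); synergy = I12 - max(I1, I2);
    unique_i = Ii - redundancy; net synergy = I12 - I1 - I2."""
    i1 = gaussian_mi(cov, src1, tgt)
    i2 = gaussian_mi(cov, src2, tgt)
    i12 = gaussian_mi(cov, list(src1) + list(src2), tgt)
    red = min(i1, i2)
    return {"redundancy": red,
            "unique1": i1 - red,
            "unique2": i2 - red,
            "synergy": i12 - max(i1, i2),
            "net_synergy": i12 - i1 - i2}

# Illustrative example: Y = X1 + X2 + noise, with corr(X1, X2) = 0.5.
c, noise_var = 0.5, 1.0
cov = np.array([[1.0,     c,       1.0 + c],
                [c,       1.0,     1.0 + c],
                [1.0 + c, 1.0 + c, 2.0 + 2 * c + noise_var]])
pid = mmi_pid(cov, [0], [1], [2])
```

Because the two sources are statistically identical here, both unique terms vanish and all single-source information counts as redundancy; the sign of the net synergy then depends on the source correlation `c` and the noise variance, consistent with the paper's observation that Gaussian systems frequently exhibit net synergy.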
Pages: 14