Transfer entropy

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by BiObserver at 23:47, 4 June 2013 (→See also: + Mutual Information). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

Transfer entropy is a non-parametric statistic measuring the amount of directed (time-asymmetric) transfer of information between two random processes.[1][2][3] Transfer entropy from a process X to another process Y is the amount of uncertainty reduced in future values of Y by knowing the past values of X given past values of Y. More specifically, if $X_t$ and $Y_t$ for $t \in \mathbb{N}$ denote two random processes and the amount of information is measured using Shannon's entropy, the transfer entropy can be written as:

$$T_{X \rightarrow Y} = H\left(Y_t \mid Y_{t-1:t-L}\right) - H\left(Y_t \mid Y_{t-1:t-L}, X_{t-1:t-L}\right),$$

where H(X) is the Shannon entropy of X. The above definition of transfer entropy has been extended by other types of entropy measures such as Rényi entropy.[3]
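As a minimal sketch (not part of the article), the conditional entropies in the definition can be estimated for discrete-valued series by plugging empirical frequencies into the identity H(A | B) = H(A, B) − H(B). The function names and the history length `k` below are illustrative choices, not a standard API:

```python
from collections import Counter
from math import log2

def entropy(samples):
    # Plug-in Shannon entropy (in bits) of a sequence of hashable outcomes.
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def transfer_entropy(x, y, k=1):
    # Estimate T_{X->Y} = H(Y_t | Y_past) - H(Y_t | Y_past, X_past)
    # with history length k, using H(A|B) = H(A,B) - H(B).
    t_range = range(k, len(y))
    yt    = [y[t] for t in t_range]
    ypast = [tuple(y[t - k:t]) for t in t_range]
    xpast = [tuple(x[t - k:t]) for t in t_range]
    h_y_given_ypast = entropy(list(zip(yt, ypast))) - entropy(ypast)
    h_y_given_both = (entropy(list(zip(yt, ypast, xpast)))
                      - entropy(list(zip(ypast, xpast))))
    return h_y_given_ypast - h_y_given_both
```

For example, if `y` is simply `x` delayed by one step, the estimate of T(X→Y) approaches one bit while T(Y→X) stays near zero; the plug-in estimator is biased upward for short series, which is one reason transfer entropy usually requires more samples than model-based measures.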

Transfer entropy reduces to Granger causality for vector auto-regressive processes.[4] Hence, it is advantageous when the model assumption of Granger causality does not hold, for example in the analysis of non-linear signals.[5][6] However, it usually requires more samples for accurate estimation and is currently limited to bivariate analysis.[7]
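The equivalence for Gaussian variables can be stated concretely: regressing Y first on its own past alone and then on the pasts of both Y and X, the Granger causality statistic is the log-ratio of the two residual variances, and the transfer entropy (in nats) is half of it, as shown by Barnett et al.[4] A sketch of the relation:

```latex
% Granger causality from X to Y: log-ratio of prediction-error variances
F_{X \to Y} = \ln \frac{\operatorname{var}\!\left(\epsilon_{Y \mid Y^{-}}\right)}
                       {\operatorname{var}\!\left(\epsilon_{Y \mid Y^{-},\, X^{-}}\right)}

% For jointly Gaussian processes (Barnett et al., 2009):
T_{X \to Y} = \tfrac{1}{2}\, F_{X \to Y}
```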

Transfer entropy has been used to estimate the functional connectivity of neurons[8] and social influence in social networks.[5]

See also

  * Mutual information

References

  1. ^ Schreiber, Thomas (July 2000). "Measuring Information Transfer". Physical Review Letters. 85 (2): 461–464. doi:10.1103/PhysRevLett.85.461.
  2. ^ Seth, Anil (2007). "Granger causality". Scholarpedia. doi:10.4249/scholarpedia.1667.
  3. ^ a b Hlaváčková-Schindler, Katerina; et al. (March 2007). "Causality detection based on information-theoretic approaches in time series analysis". Physics Reports. 441 (1): 1–46. doi:10.1016/j.physrep.2006.12.004.
  4. ^ Barnett, Lionel (December 2009). "Granger Causality and Transfer Entropy Are Equivalent for Gaussian Variables". Physical Review Letters. 103 (23). doi:10.1103/PhysRevLett.103.238701.
  5. ^ a b Ver Steeg, Greg; Galstyan, Aram (2012). "Information transfer in social media". Proceedings of the 21st International Conference on World Wide Web (WWW '12). ACM. pp. 509–518.
  6. ^ Lungarella, M.; et al. (March 2007). "Methods for Quantifying the Causal Structure of Bivariate Time Series". International Journal of Bifurcation and Chaos. 17 (3): 903–921. doi:10.1142/S0218127407017628.
  7. ^ Pereda, E.; et al. (2005). "Nonlinear multivariate analysis of neurophysiological signals". Progress in Neurobiology. 77 (1–2): 1–37. PMID 16289760.
  8. ^ Vicente, Raul; et al. (2011). "Transfer entropy—a model-free measure of effective connectivity for the neurosciences". Journal of Computational Neuroscience. 30 (1): 45–67. doi:10.1007/s10827-010-0262-3.

External links