Transfer entropy

Transfer entropy is a non-parametric statistic measuring the amount of directed (time-asymmetric) transfer of information between two random processes.[1][2][3] Transfer entropy from a process X to another process Y is the amount of uncertainty reduced in future values of Y by knowing the past values of X given past values of Y. More specifically, if $X_t$ and $Y_t$ for $t \in \mathbb{N}$ denote two random processes and the amount of information is measured using Shannon's entropy, the transfer entropy can be written as:

$$ T_{X \rightarrow Y} = H\left(Y_t \mid Y_{t-1:t-L}\right) - H\left(Y_t \mid Y_{t-1:t-L}, X_{t-1:t-L}\right), $$

where $H(X)$ is the Shannon entropy of $X$. The above definition of transfer entropy has been extended to other entropy measures, such as Rényi entropy.[3]
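
For discrete-valued processes, the two conditional entropies in this definition can be estimated directly from empirical frequencies. The following Python sketch is a minimal plug-in estimator (the function name transfer_entropy and its interface are illustrative, not from any standard library); plug-in estimates are biased and only reliable when the sequences are long relative to the history length k:

    import numpy as np
    from collections import Counter

    def transfer_entropy(x, y, k=1):
        # Plug-in estimate of T_{X -> Y} in bits for discrete sequences,
        # with history length k. Expands
        #   T = H(Y_t | Y_past) - H(Y_t | Y_past, X_past)
        # into four joint entropies estimated from observed frequencies.
        x, y = np.asarray(x), np.asarray(y)
        samples = [(y[t], tuple(y[t - k:t]), tuple(x[t - k:t]))
                   for t in range(k, len(y))]

        def H(counts):
            p = np.array(list(counts.values()), dtype=float)
            p /= p.sum()
            return -(p * np.log2(p)).sum()

        c_full = Counter(samples)                          # (y_t, y_past, x_past)
        c_pp = Counter((yp, xp) for _, yp, xp in samples)  # (y_past, x_past)
        c_yp = Counter((yt, yp) for yt, yp, _ in samples)  # (y_t, y_past)
        c_p = Counter(yp for _, yp, _ in samples)          # (y_past,)
        return H(c_yp) - H(c_p) - H(c_full) + H(c_pp)

    # X drives Y with a one-step delay, so the estimate of T(X -> Y)
    # approaches 1 bit while T(Y -> X) stays near 0.
    rng = np.random.default_rng(0)
    x = rng.integers(0, 2, 10000)
    y = np.roll(x, 1)
    print(transfer_entropy(x, y, k=1))  # close to 1.0
    print(transfer_entropy(y, x, k=1))  # close to 0.0

The asymmetry of the two printed values is exactly the time-asymmetry the measure is designed to capture.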

Transfer entropy is a conditional mutual information,[4][5] with the history of the influenced variable $Y_{t-1:t-L}$ in the condition. Transfer entropy reduces to Granger causality for vector auto-regressive processes.[6] Hence, it is advantageous when the model assumptions of Granger causality do not hold, for example in the analysis of non-linear signals.[7][8] However, it usually requires more samples for accurate estimation.[9] While it was originally defined for bivariate analysis, transfer entropy has been extended to multivariate forms, either conditioning on other potential source variables[10] or considering transfer from a collection of sources,[11] although these forms again require more samples.
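
Written as a conditional mutual information, the measure takes the compact form below; the second equality is the chain-rule expansion back into the two conditional entropies of the definition above:

$$ T_{X \rightarrow Y} = I\left(Y_t \,;\, X_{t-1:t-L} \mid Y_{t-1:t-L}\right) = H\left(Y_t \mid Y_{t-1:t-L}\right) - H\left(Y_t \mid Y_{t-1:t-L}, X_{t-1:t-L}\right). $$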

Transfer entropy has been used to estimate the functional connectivity of neurons[11][12] and social influence in social networks.[7]

See also

References

  1. ^ Schreiber, Thomas (1 July 2000). "Measuring Information Transfer". Physical Review Letters. 85 (2): 461–464. doi:10.1103/PhysRevLett.85.461.
  2. ^ Seth, Anil (2007). "Granger causality". Scholarpedia. doi:10.4249/scholarpedia.1667.
  3. ^ a b Hlaváčková-Schindler, Katerina; Paluš, M.; Vejmelka, M.; Bhattacharya, J. (1 March 2007). "Causality detection based on information-theoretic approaches in time series analysis". Physics Reports. 441 (1): 1–46. doi:10.1016/j.physrep.2006.12.004.
  4. ^ Wyner, A. D. (1978). "A definition of conditional mutual information for arbitrary ensembles". Information and Control. 38 (1): 51–59. doi:10.1016/s0019-9958(78)90026-8.
  5. ^ Dobrushin, R. L. (1959). "General formulation of Shannon's main theorem in information theory". Uspekhi Mat. Nauk. 14: 3–104.
  6. ^ Barnett, Lionel (1 December 2009). "Granger Causality and Transfer Entropy Are Equivalent for Gaussian Variables". Physical Review Letters. 103 (23). doi:10.1103/PhysRevLett.103.238701.
  7. ^ a b Ver Steeg, Greg; Galstyan, Aram (2012). "Information transfer in social media". Proceedings of the 21st international conference on World Wide Web (WWW '12). ACM. pp. 509–518.
  8. ^ Lungarella, M.; Ishiguro, K.; Kuniyoshi, Y.; Otsu, N. (1 March 2007). "Methods for quantifying the causal structure of bivariate time series". International Journal of Bifurcation and Chaos. 17 (3): 903–921. doi:10.1142/S0218127407017628.
  9. ^ Pereda, E.; Quiroga, R. Q.; Bhattacharya, J. (Sep–Oct 2005). "Nonlinear multivariate analysis of neurophysiological signals". Progress in Neurobiology. 77 (1–2): 1–37. doi:10.1016/j.pneurobio.2005.10.003. PMID 16289760.
  10. ^ Lizier, Joseph; Prokopenko, Mikhail; Zomaya, Albert (2008). "Local information transfer as a spatiotemporal filter for complex systems". Physical Review E. 77 (2): 026110. doi:10.1103/PhysRevE.77.026110.
  11. ^ a b Lizier, Joseph; Heinzle, Jakob; Horstmann, Annette; Haynes, John-Dylan; Prokopenko, Mikhail (2011). "Multivariate information-theoretic measures reveal directed information structure and task relevant changes in fMRI connectivity". Journal of Computational Neuroscience. 30 (1): 85–107. doi:10.1007/s10827-010-0271-2.
  12. ^ Vicente, Raul; Wibral, Michael; Lindner, Michael; Pipa, Gordon (February 2011). "Transfer entropy—a model-free measure of effective connectivity for the neurosciences". Journal of Computational Neuroscience. 30 (1): 45–67. doi:10.1007/s10827-010-0262-3.

External links