Information diagram

From Wikipedia, the free encyclopedia
[Figure: individual entropies H(X), H(Y), joint entropy H(X,Y), and conditional entropies for a pair of correlated subsystems X, Y with mutual information I(X; Y).]

An information diagram is a type of Venn diagram used in information theory to illustrate relationships among Shannon's basic measures of information: entropy, joint entropy, conditional entropy and mutual information.[1][2] Information diagrams are a useful pedagogical tool for teaching and learning about these basic measures, but using such diagrams carries some non-trivial implications. In particular, the set-theoretic "measure" underlying the diagram must be taken as a signed measure: for three or more variables, the central region corresponding to the interaction information I(X; Y; Z) can be negative. (See the article Information theory and measure theory for more information.)
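For two variables, the regions of the diagram satisfy the identities I(X; Y) = H(X) + H(Y) - H(X,Y) and H(X,Y) = H(X|Y) + I(X; Y) + H(Y|X). A minimal sketch of how these identities can be checked numerically, assuming NumPy and a made-up joint distribution for a pair of correlated binary variables:

    import numpy as np

    # Hypothetical joint distribution p(x, y) over two binary variables.
    p_xy = np.array([[0.4, 0.1],
                     [0.1, 0.4]])

    def H(p):
        """Shannon entropy in bits; zero-probability cells contribute 0."""
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    p_x = p_xy.sum(axis=1)           # marginal of X
    p_y = p_xy.sum(axis=0)           # marginal of Y

    H_X, H_Y = H(p_x), H(p_y)
    H_XY = H(p_xy)                   # joint entropy H(X,Y)
    I_XY = H_X + H_Y - H_XY          # mutual information I(X;Y)
    H_X_given_Y = H_XY - H_Y         # conditional entropy H(X|Y)
    H_Y_given_X = H_XY - H_X         # conditional entropy H(Y|X)

    # The three regions of the two-variable diagram tile the joint entropy.
    assert np.isclose(H_XY, H_X_given_Y + I_XY + H_Y_given_X)
    print(f"H(X)={H_X:.3f}, H(Y)={H_Y:.3f}, H(X,Y)={H_XY:.3f}, I(X;Y)={I_XY:.3f}")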

Random variables X, Y, and Z are said to form a Markov chain if Z and X are conditionally independent given Y; in an information diagram, this means the region corresponding to the conditional mutual information I(X; Z | Y) vanishes. Intuitively, Y contains all the information about X that is relevant to Z, so knowing X adds nothing to one's knowledge of Z once Y is given.
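A minimal numerical sketch of this property, assuming NumPy and made-up transition probabilities for a binary chain X → Y → Z; the identity I(X;Z|Y) = H(X,Y) + H(Y,Z) - H(Y) - H(X,Y,Z) is used to evaluate the conditional mutual information directly from the joint distribution:

    import numpy as np

    # Hypothetical Markov chain X -> Y -> Z: p(x,y,z) = p(x) p(y|x) p(z|y).
    p_x = np.array([0.5, 0.5])
    p_y_given_x = np.array([[0.9, 0.1],   # rows: x, columns: y
                            [0.2, 0.8]])
    p_z_given_y = np.array([[0.7, 0.3],   # rows: y, columns: z
                            [0.4, 0.6]])

    p_xyz = (p_x[:, None, None]
             * p_y_given_x[:, :, None]
             * p_z_given_y[None, :, :])

    def H(p):
        """Shannon entropy in bits; zero-probability cells contribute 0."""
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    H_XY = H(p_xyz.sum(axis=2))           # H(X,Y)
    H_YZ = H(p_xyz.sum(axis=0))           # H(Y,Z)
    H_Y = H(p_xyz.sum(axis=(0, 2)))       # H(Y)
    H_XYZ = H(p_xyz)                      # H(X,Y,Z)

    I_XZ_given_Y = H_XY + H_YZ - H_Y - H_XYZ
    print(f"I(X;Z|Y) = {I_XZ_given_Y:.6f}")   # ~0 for a Markov chain
    assert abs(I_XZ_given_Y) < 1e-12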

References

  1. ^ Reza, Fazlollah (1961). An Introduction to Information Theory. New York: McGraw-Hill; reprinted New York: Dover, 1994. ISBN 0-486-68210-2.
  2. ^ Yeung, R. W. (2002). A First Course in Information Theory. Norwell, MA / New York: Kluwer/Plenum.