An information diagram is a type of Venn diagram used in information theory to illustrate relationships among Shannon's basic measures of information: entropy, joint entropy, conditional entropy, and mutual information. Information diagrams are a useful pedagogical tool for teaching and learning about these basic measures of information, but using such diagrams carries some non-trivial implications. For example, Shannon's entropy in the context of an information diagram must be taken as a signed measure: the region shared by all three variables corresponds to the interaction information I(X;Y;Z), which can be negative. (See the article Information theory and measure theory for more information.)
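The correspondence between diagram regions and Shannon's measures can be checked numerically. The sketch below (a hypothetical illustration, not from the article; the joint distribution is an arbitrary example) computes entropy, joint entropy, conditional entropy, and mutual information for a pair of binary variables and verifies the region identity H(X) = H(X|Y) + I(X;Y):

```python
from math import log2

def H(dist):
    """Shannon entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Illustrative joint distribution p(x, y) for two correlated binary variables.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions p(x) and p(y).
p_x, p_y = {}, {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0) + p
    p_y[y] = p_y.get(y, 0) + p

H_X, H_Y, H_XY = H(p_x), H(p_y), H(p_xy)
I_XY = H_X + H_Y - H_XY          # mutual information: the overlap of the two circles
H_X_given_Y = H_XY - H_Y         # conditional entropy: the X circle minus the overlap

# Identity read directly off the diagram: H(X) = H(X|Y) + I(X;Y).
assert abs(H_X - (H_X_given_Y + I_XY)) < 1e-9
```

Each quantity here is the measure of one region of the two-variable diagram, and the final assertion is the statement that the X circle is the disjoint union of its private part and the overlap.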
Random variables X, Y, and Z are said to form a Markov chain if X and Z are conditionally independent given Y. Intuitively, Y contains all the information about X that is relevant to Z, so knowing X adds nothing to one's knowledge of Z once Y is given. In an information diagram, this corresponds to the conditional mutual information I(X;Z|Y) being zero.
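The Markov-chain condition can also be verified numerically. In this sketch (a hypothetical illustration with arbitrarily chosen conditional distributions), a joint distribution is built with the chain structure p(x,y,z) = p(x)p(y|x)p(z|y), and the conditional mutual information I(X;Z|Y) is computed from joint entropies and confirmed to vanish:

```python
from math import log2
from itertools import product

def H(dist):
    """Shannon entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal(dist, keep):
    """Marginalize a joint distribution over tuples, keeping the given index positions."""
    out = {}
    for outcome, p in dist.items():
        key = tuple(outcome[i] for i in keep)
        out[key] = out.get(key, 0) + p
    return out

# Arbitrary example: p(x), p(y|x), p(z|y) for binary variables.
p_x = {0: 0.3, 1: 0.7}
p_y_given_x = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.2, 1: 0.8}}
p_z_given_y = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.3, 1: 0.7}}

# Joint distribution with the Markov structure p(x,y,z) = p(x)p(y|x)p(z|y).
p_xyz = {(x, y, z): p_x[x] * p_y_given_x[x][y] * p_z_given_y[y][z]
         for x, y, z in product((0, 1), repeat=3)}

# I(X;Z|Y) = H(X,Y) + H(Y,Z) - H(Y) - H(X,Y,Z); it is zero for a Markov chain.
I_XZ_given_Y = (H(marginal(p_xyz, (0, 1))) + H(marginal(p_xyz, (1, 2)))
                - H(marginal(p_xyz, (1,))) - H(p_xyz))
assert abs(I_XZ_given_Y) < 1e-9
```

In the three-variable information diagram, I(X;Z|Y) is the region shared by the X and Z circles but outside the Y circle; the Markov-chain condition says that region has measure zero.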