A misleading Venn diagram showing additive and subtractive relationships between various information measures associated with correlated variables X and Y. The area contained by both circles is the joint entropy H(X,Y). The circle on the left (red and violet) is the individual entropy H(X), with the red being the conditional entropy H(X|Y). The circle on the right (blue and violet) is H(Y), with the blue being H(Y|X). The violet is the mutual information I(X;Y).
Greater than individual entropies
The joint entropy of a set of variables is greater than or equal to the maximum of all the individual entropies of the variables in the set.
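In symbols, with H denoting entropy as in the figure caption above, this property reads:

$$H(X_1, \ldots, X_n) \ge \max_{1 \le i \le n} H(X_i)$$

In particular, $H(X,Y) \ge \max\{H(X), H(Y)\}$: jointly describing two variables can never require less information than describing the more uncertain one alone.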
Less than or equal to the sum of individual entropies
The joint entropy of a set of variables is less than or equal to the sum of the individual entropies of the variables in the set. This is an example of subadditivity. This inequality is an equality if and only if X and Y are statistically independent (Cover & Thomas 2006, p. 30).
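Together with the previous property, the two-variable bounds read:

$$\max\{H(X), H(Y)\} \;\le\; H(X,Y) \;\le\; H(X) + H(Y)$$

The short sketch below checks both bounds for a small joint pmf; the probability values are arbitrary illustrative choices, not taken from the source.

```python
import numpy as np

# Joint pmf of two correlated binary variables (illustrative values).
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability cells."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_joint = entropy(p_xy)          # joint entropy H(X,Y)
H_x = entropy(p_xy.sum(axis=1))  # marginal entropy H(X)
H_y = entropy(p_xy.sum(axis=0))  # marginal entropy H(Y)

# max(H(X), H(Y)) <= H(X,Y) <= H(X) + H(Y)
print(f"H(X,Y) = {H_joint:.4f}, H(X) = {H_x:.4f}, H(Y) = {H_y:.4f}")
assert max(H_x, H_y) <= H_joint <= H_x + H_y
```

Here H(X,Y) ≈ 1.72 bits lies strictly between the maximum (1 bit) and the sum (2 bits), since X and Y are correlated but not deterministically linked.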
The above definition is for discrete random variables, and an analogous definition applies to continuous random variables. The continuous version of discrete joint entropy is called joint differential (or continuous) entropy. Let X and Y be continuous random variables with joint probability density function f(x,y). The differential joint entropy h(X,Y) is defined as (Cover & Thomas 2006, p. 249):

$$h(X,Y) = -\int_{\mathcal{X} \times \mathcal{Y}} f(x,y)\,\log f(x,y)\,dx\,dy$$
For more than two continuous random variables the definition is generalized to:

$$h(X_1, \ldots, X_n) = -\int f(x_1, \ldots, x_n)\,\log f(x_1, \ldots, x_n)\,dx_1 \cdots dx_n$$
The integral is taken over the support of f. It is possible that the integral does not exist, in which case we say that the differential entropy is not defined.
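As a concrete check of the continuous definition, the sketch below numerically integrates −f log f for a bivariate Gaussian and compares the result with the known closed form ½ ln((2πe)² det Σ) for a Gaussian's differential entropy. It is a minimal illustration assuming NumPy and SciPy are available; the covariance matrix and integration box are arbitrary choices.

```python
import numpy as np
from scipy import integrate
from scipy.stats import multivariate_normal

# Bivariate Gaussian with correlation 0.5 (illustrative choice).
cov = np.array([[1.0, 0.5],
                [0.5, 1.0]])
rv = multivariate_normal(mean=[0.0, 0.0], cov=cov)

def integrand(y, x):
    """-f(x,y) * ln f(x,y): the integrand of the joint differential entropy."""
    f = rv.pdf([x, y])
    return -f * np.log(f) if f > 0 else 0.0

# Integrate over a box wide enough to hold essentially all probability mass.
h_numeric, _ = integrate.dblquad(integrand, -8, 8, lambda x: -8, lambda x: 8)

# Closed form for an n-dimensional Gaussian: 0.5 * ln((2*pi*e)^n * det(cov)).
h_exact = 0.5 * np.log((2 * np.pi * np.e) ** 2 * np.linalg.det(cov))

print(f"numeric: {h_numeric:.4f} nats, exact: {h_exact:.4f} nats")  # both ≈ 2.694
```

Note that the result is in nats because the natural logarithm is used; unlike discrete joint entropy, the differential version can be negative (e.g. for a sufficiently concentrated density).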
Korn, Theresa M.; Korn, Granino Arthur (January 2000). Mathematical Handbook for Scientists and Engineers: Definitions, Theorems, and Formulas for Reference and Review. New York: Dover Publications. ISBN 0-486-41147-8.
Cover, Thomas M.; Thomas, Joy A. (18 July 2006). Elements of Information Theory. Hoboken, New Jersey: Wiley. ISBN 0-471-24195-4.