Venn diagram of information-theoretic measures for three variables x, y, and z. Each circle represents an individual entropy: H(x) is the lower left circle, H(y) the lower right, and H(z) the upper circle. The intersection of any two circles represents the mutual information of the two associated variables (e.g. I(x;z) is yellow and gray). The union of any two circles is the joint entropy of the two associated variables (e.g. H(x,y) is everything but green). The joint entropy H(x,y,z) of all three variables is the union of all three circles. It is partitioned into seven regions: red, blue, and green are the conditional entropies H(x|y,z), H(y|x,z), and H(z|x,y) respectively; yellow, magenta, and cyan are the conditional mutual informations I(x;z|y), I(y;z|x), and I(x;y|z) respectively; and gray is the multivariate mutual information I(x;y;z). The multivariate mutual information is the only one of these quantities that may be negative.
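The region identities in the caption can be checked numerically. The following is a minimal sketch (not part of the original figure) that computes the regions from a joint distribution via standard entropy identities, e.g. I(x;z|y) = H(x,y) + H(y,z) - H(x,y,z) - H(y) and I(x;y;z) = I(x;y) - I(x;y|z). The distribution is a hypothetical choice for illustration: x and y are independent fair bits and z = x XOR y, a standard example in which I(x;y;z) comes out negative.

```python
import itertools
import math

# Joint distribution p(x, y, z): x, y independent fair bits, z = x XOR y.
p = {(x, y, x ^ y): 0.25 for x, y in itertools.product((0, 1), repeat=2)}

def H(axes):
    """Joint entropy (in bits) of the variables at the given axis indices."""
    marg = {}
    for outcome, prob in p.items():
        key = tuple(outcome[a] for a in axes)
        marg[key] = marg.get(key, 0.0) + prob
    return -sum(q * math.log2(q) for q in marg.values() if q > 0)

X, Y, Z = 0, 1, 2

# Pairwise mutual information: I(x;z) = H(x) + H(z) - H(x,z).
I_xz = H([X]) + H([Z]) - H([X, Z])

# A conditional entropy region (red): H(x|y,z) = H(x,y,z) - H(y,z).
H_x_given_yz = H([X, Y, Z]) - H([Y, Z])

# A conditional mutual information region (yellow):
# I(x;z|y) = H(x,y) + H(y,z) - H(x,y,z) - H(y).
I_xz_given_y = H([X, Y]) + H([Y, Z]) - H([X, Y, Z]) - H([Y])

# The multivariate mutual information region (gray):
# I(x;y;z) = I(x;y) - I(x;y|z).
I_xy = H([X]) + H([Y]) - H([X, Y])
I_xy_given_z = H([X, Z]) + H([Y, Z]) - H([X, Y, Z]) - H([Z])
I_xyz = I_xy - I_xy_given_z

print(f"I(x;z)   = {I_xz:.3f} bits")          # 0.000: x and z alone are independent
print(f"H(x|y,z) = {H_x_given_yz:.3f} bits")  # 0.000: x is determined by y and z
print(f"I(x;z|y) = {I_xz_given_y:.3f} bits")  # 1.000
print(f"I(x;y;z) = {I_xyz:.3f} bits")         # -1.000: negative, as the caption notes
```

Here the gray region evaluates to -1 bit: knowing z creates dependence between the otherwise independent x and y, which is exactly why the multivariate mutual information, alone among the seven regions, may be negative.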