Venn diagram for various information measures associated with correlated variables X and Y. The area contained by both circles is the joint entropy H(X,Y). The circle on the left (red and violet) is the individual entropy H(X), with the red being the conditional entropy H(X|Y). The circle on the right (blue and violet) is H(Y), with the blue being H(Y|X). The violet is the mutual information I(X;Y).
In information theory, the conditional entropy (or equivocation) quantifies the amount of information needed to describe the outcome of a random variable $Y$ given that the value of another random variable $X$ is known. Here, information is measured in shannons, nats, or hartleys. The entropy of $Y$ conditioned on $X$ is written as $H(Y \mid X)$.
If $H(Y \mid X = x)$ is the entropy of the variable $Y$ conditioned on the variable $X$ taking a certain value $x$, then $H(Y \mid X)$ is the result of averaging $H(Y \mid X = x)$ over all possible values $x$ that $X$ may take.
Given discrete random variables $X$ with image $\mathcal{X}$ and $Y$ with image $\mathcal{Y}$, the conditional entropy of $Y$ given $X$ is defined as:
$$H(Y \mid X) = -\sum_{x \in \mathcal{X},\, y \in \mathcal{Y}} p(x, y) \log \frac{p(x, y)}{p(x)}.$$
(Intuitively, this can be thought of as the weighted sum of $H(Y \mid X = x)$ for each possible value $x$ of $X$, using $p(x)$ as the weights.)
Note: It is understood that the expressions $0 \log 0$ and $0 \log (c/0)$ for fixed $c > 0$ should be treated as being equal to zero.
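To make the definition concrete, here is a minimal Python sketch (not part of the original article); the 2×2 joint distribution p_xy is made up purely for illustration. It computes $H(Y \mid X)$ in bits both directly from the definition and as the weighted average of $H(Y \mid X = x)$:

```python
import math

# Hypothetical 2x2 joint distribution p(x, y) over X in {0, 1}, Y in {0, 1};
# the numbers are made up purely for illustration.
p_xy = {
    (0, 0): 0.25, (0, 1): 0.25,
    (1, 0): 0.40, (1, 1): 0.10,
}

# Marginal p(x), obtained by summing the joint distribution over y.
p_x = {}
for (x, _), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p

# Definition: H(Y|X) = -sum_{x,y} p(x,y) * log2( p(x,y) / p(x) ),
# with the convention that zero-probability terms contribute nothing.
h_def = -sum(p * math.log2(p / p_x[x]) for (x, _), p in p_xy.items() if p > 0)

# Equivalent form: H(Y|X) = sum_x p(x) * H(Y | X = x),
# i.e. the entropy of Y for each fixed x, weighted by p(x).
h_avg = 0.0
for x, px in p_x.items():
    cond = [p / px for (xi, _), p in p_xy.items() if xi == x and p > 0]
    h_avg += px * -sum(q * math.log2(q) for q in cond)

print(h_def, h_avg)  # both print the same value (in bits/shannons)
```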
Assume that the combined system determined by two random variables $X$ and $Y$ has joint entropy $H(X, Y)$, that is, we need $H(X, Y)$ bits of information to describe its exact state. Now if we first learn the value of $X$, we have gained $H(X)$ bits of information. Once $X$ is known, we only need $H(X, Y) - H(X)$ bits to describe the state of the whole system. This quantity is exactly $H(Y \mid X)$, which gives the chain rule of conditional entropy:
$$H(Y \mid X) = H(X, Y) - H(X).$$
The chain rule follows from the above definition of conditional entropy:
$$\begin{aligned}
H(Y \mid X) &= -\sum_{x \in \mathcal{X},\, y \in \mathcal{Y}} p(x, y) \log \frac{p(x, y)}{p(x)} \\
&= -\sum_{x \in \mathcal{X},\, y \in \mathcal{Y}} p(x, y) \bigl(\log p(x, y) - \log p(x)\bigr) \\
&= -\sum_{x \in \mathcal{X},\, y \in \mathcal{Y}} p(x, y) \log p(x, y) + \sum_{x \in \mathcal{X}} p(x) \log p(x) \\
&= H(X, Y) - H(X).
\end{aligned}$$
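As a quick numeric check of the chain rule (again an illustrative sketch, reusing the same made-up distribution as in the earlier Python example):

```python
import math

# Hypothetical 2x2 joint distribution from the earlier sketch, with its marginal over X.
p_xy = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.40, (1, 1): 0.10}
p_x = {0: 0.50, 1: 0.50}

h_joint = -sum(p * math.log2(p) for p in p_xy.values() if p > 0)                  # H(X,Y)
h_x = -sum(p * math.log2(p) for p in p_x.values() if p > 0)                       # H(X)
h_cond = -sum(p * math.log2(p / p_x[x]) for (x, _), p in p_xy.items() if p > 0)   # H(Y|X)

# Chain rule: H(Y|X) = H(X,Y) - H(X)
assert abs(h_cond - (h_joint - h_x)) < 1e-9
print(h_joint, h_x, h_cond)
```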
In general, a chain rule for multiple random variables holds:
$$H(X_1, X_2, \ldots, X_n) = \sum_{i=1}^{n} H(X_i \mid X_1, \ldots, X_{i-1}).$$
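For example, for $n = 3$ this unrolls to:
$$H(X_1, X_2, X_3) = H(X_1) + H(X_2 \mid X_1) + H(X_3 \mid X_1, X_2).$$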
The general rule has a similar form to the chain rule of probability theory, except that addition is used instead of multiplication.
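To make the analogy explicit (a side-by-side added here for illustration): taking $-\log$ of the probability chain rule and then the expectation over the joint distribution turns the product into the corresponding sum of conditional entropies:
$$P(X_1, \ldots, X_n) = \prod_{i=1}^{n} P(X_i \mid X_1, \ldots, X_{i-1})
\quad\text{versus}\quad
H(X_1, \ldots, X_n) = \sum_{i=1}^{n} H(X_i \mid X_1, \ldots, X_{i-1}).$$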