In information theory, the conditional entropy (or equivocation) quantifies the amount of information needed to describe the outcome of a random variable $Y$ given that the value of another random variable $X$ is known. Here, information is measured in shannons, nats, or hartleys. The entropy of $Y$ conditioned on $X$ is written as $H(Y\mid X)$.
If $H(Y\mid X=x)$ is the entropy of the variable $Y$ conditioned on the variable $X$ taking a certain value $x$, then $H(Y\mid X)$ is the result of averaging $H(Y\mid X=x)$ over all possible values $x$ that $X$ may take. Given discrete random variables $X$ with support $\mathcal X$ and $Y$ with support $\mathcal Y$, the conditional entropy of $Y$ given $X$ is defined as:

$$H(Y\mid X) = -\sum_{x\in\mathcal X,\, y\in\mathcal Y} p(x,y)\log\frac{p(x,y)}{p(x)}.$$
Note: The supports of $X$ and $Y$ can be replaced by their domains if it is understood that $0 \log 0$ should be treated as being equal to zero.
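As an illustration, the defining sum can be evaluated directly from a joint probability table. The following is a minimal Python sketch of that computation (the function name conditional_entropy and the dictionary representation of the joint distribution are choices made here for illustration, not part of the article):

```python
import math

def conditional_entropy(joint, base=2.0):
    """H(Y|X) for a joint distribution given as {(x, y): p(x, y)}.

    base=2 measures the result in shannons (bits), base=math.e in
    nats, and base=10 in hartleys.
    """
    # Marginal p(x) = sum over y of p(x, y).
    px = {}
    for (x, _), p in joint.items():
        px[x] = px.get(x, 0.0) + p
    # H(Y|X) = -sum_{x,y} p(x,y) log( p(x,y) / p(x) ),
    # with 0 log 0 treated as zero (zero-probability terms skipped).
    return -sum(p * math.log(p / px[x], base)
                for (x, _), p in joint.items() if p > 0.0)

# Example: Y is a noisy copy of a fair bit X, flipped with probability 0.1.
joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
print(conditional_entropy(joint))  # ~0.469 bits: the entropy of the noise
```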
$H(Y\mid X) = 0$ if and only if the value of $Y$ is completely determined by the value of $X$. Conversely, $H(Y\mid X) = H(Y)$ if and only if $Y$ and $X$ are independent random variables.
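Both extremes can be seen in a short worked example (an illustration added here, not taken from the article). Let $X$ be a fair coin. If $Y = X$, then $p(x,y) = p(x)$ on the two possible outcomes, so every term of the defining sum vanishes:

$$H(Y\mid X) = -\tfrac{1}{2}\log 1 - \tfrac{1}{2}\log 1 = 0.$$

If instead $Y$ is a second, independent fair coin, then $p(x,y)/p(x) = p(y) = \tfrac{1}{2}$ on all four outcomes, so $H(Y\mid X) = \log 2 = H(Y)$, one shannon.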
Assume that the combined system determined by two random variables $X$ and $Y$ has entropy $H(X,Y)$, that is, we need $H(X,Y)$ bits of information to describe its exact state. Now if we first learn the value of $X$, we have gained $H(X)$ bits of information. Once $X$ is known, we only need $H(X,Y) - H(X)$ bits to describe the state of the whole system. This quantity is exactly $H(Y\mid X)$, which gives the chain rule of conditional entropy:

$$H(Y\mid X) = H(X,Y) - H(X).$$
Formally, the chain rule indeed follows from the above definition of conditional entropy:

$$\begin{aligned}
H(Y\mid X) &= \sum_{x\in\mathcal X,\, y\in\mathcal Y} p(x,y)\log\frac{p(x)}{p(x,y)} \\
&= \sum_{x\in\mathcal X,\, y\in\mathcal Y} p(x,y)\bigl(\log p(x) - \log p(x,y)\bigr) \\
&= -\sum_{x\in\mathcal X,\, y\in\mathcal Y} p(x,y)\log p(x,y) + \sum_{x\in\mathcal X} p(x)\log p(x) \\
&= H(X,Y) - H(X).
\end{aligned}$$
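The chain rule can also be checked numerically. Below is a minimal Python sketch (the helper entropy and the example joint table are illustrative choices, not from the article) verifying $H(Y\mid X) = H(X,Y) - H(X)$ on an arbitrary joint distribution:

```python
import math

def entropy(probs, base=2.0):
    # H = -sum p log p, with 0 log 0 treated as zero.
    return -sum(p * math.log(p, base) for p in probs if p > 0.0)

# An arbitrary joint distribution p(x, y), chosen only for illustration.
joint = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.4, (1, 1): 0.2}

# Marginal p(x).
px = {}
for (x, _), p in joint.items():
    px[x] = px.get(x, 0.0) + p

h_xy = entropy(joint.values())   # H(X,Y)
h_x = entropy(px.values())       # H(X)
# H(Y|X) computed directly from the definition.
h_y_given_x = -sum(p * math.log(p / px[x], 2.0)
                   for (x, _), p in joint.items() if p > 0.0)

# Chain rule: H(Y|X) = H(X,Y) - H(X), up to floating-point error.
assert abs(h_y_given_x - (h_xy - h_x)) < 1e-12
```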
Bayes' rule for conditional entropy immediately follows:

$$H(Y\mid X) = H(X\mid Y) - H(X) + H(Y).$$
Proof. $H(Y\mid X) = H(X,Y) - H(X)$ and $H(X\mid Y) = H(Y,X) - H(Y)$. Symmetry implies $H(X,Y) = H(Y,X)$. Subtracting the two equations implies Bayes' rule. QED.
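Bayes' rule can be verified numerically in the same way; in this sketch both conditional entropies are computed directly from the definition (again with an illustrative joint table, not from the article):

```python
import math

joint = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.4, (1, 1): 0.2}

# Marginals p(x) and p(y).
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0.0)

# H(Y|X) and H(X|Y) straight from the definition.
h_y_given_x = -sum(p * math.log2(p / px[x])
                   for (x, y), p in joint.items() if p > 0.0)
h_x_given_y = -sum(p * math.log2(p / py[y])
                   for (x, y), p in joint.items() if p > 0.0)

# Bayes' rule: H(Y|X) = H(X|Y) - H(X) + H(Y).
h_x, h_y = entropy(px.values()), entropy(py.values())
assert abs(h_y_given_x - (h_x_given_y - h_x + h_y)) < 1e-12
```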
Other properties
For any $X$ and $Y$:

$$H(Y\mid X) \le H(Y)$$

$$H(X,Y) = H(X\mid Y) + H(Y\mid X) + \operatorname{I}(X;Y)$$

$$\operatorname{I}(X;Y) \le H(X)$$

where $\operatorname{I}(X;Y)$ is the mutual information between $X$ and $Y$.
For independent $X$ and $Y$:

$$H(Y\mid X) = H(Y) \quad\text{and}\quad H(X\mid Y) = H(X).$$
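These statements too can be checked numerically; a short sketch under the same illustrative conventions as above (none of the distributions below come from the article) confirms that conditioning cannot increase entropy, with equality exactly in the independent case:

```python
import math

def h_y_given_x(joint):
    # H(Y|X) from the definition, using the marginal p(x).
    px = {}
    for (x, _), p in joint.items():
        px[x] = px.get(x, 0.0) + p
    return -sum(p * math.log2(p / px[x])
                for (x, _), p in joint.items() if p > 0.0)

def h_y(joint):
    # H(Y) from the marginal p(y).
    py = {}
    for (_, y), p in joint.items():
        py[y] = py.get(y, 0.0) + p
    return -sum(p * math.log2(p) for p in py.values() if p > 0.0)

dependent = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
independent = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}

assert h_y_given_x(dependent) < h_y(dependent)                   # strict inequality
assert abs(h_y_given_x(independent) - h_y(independent)) < 1e-12  # equality
```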
Although the specific-conditional entropy, $H(X\mid Y=y)$, can be either less or greater than $H(X)$, $H(X\mid Y=y)$ can never exceed $H(X)$ when $X$ is the uniform distribution.

Generalization to quantum theory

In quantum information theory, the conditional entropy is generalized to the conditional quantum entropy. The latter can take negative values, unlike its classical counterpart.