conditional entropy

Noun

 * 1) The portion of a random variable's Shannon entropy that is independent of another, given, random variable.
 * The conditional entropy of random variable $$Y$$ given $$X$$ (i.e., conditioned on $$X$$), denoted $$H(Y|X)$$, is equal to $$H(Y) - I(Y;X)$$, where $$I(Y;X)$$ is the mutual information between $$Y$$ and $$X$$.
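The identity above can be checked numerically. The sketch below (a minimal illustration, with a hypothetical joint distribution chosen only for the example) computes $$H(Y|X)$$ from $$H(Y) - I(Y;X)$$ and cross-checks it against the equivalent chain-rule form $$H(Y|X) = H(X,Y) - H(X)$$:

```python
import math

# Hypothetical joint distribution p(x, y) over two binary variables,
# chosen only to illustrate the identity H(Y|X) = H(Y) - I(Y;X).
joint = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.2, (1, 1): 0.3,
}

def entropy(dist):
    """Shannon entropy, in bits, of a probability mass function."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Marginal distributions p(x) and p(y).
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

# Mutual information I(Y;X) = sum over (x,y) of p(x,y) * log2(p(x,y) / (p(x) p(y))).
mi = sum(p * math.log2(p / (px[x] * py[y]))
         for (x, y), p in joint.items() if p > 0)

# Conditional entropy via the identity in the definition.
h_y_given_x = entropy(py) - mi

# Cross-check against the chain rule H(Y|X) = H(X,Y) - H(X).
assert abs(h_y_given_x - (entropy(joint) - entropy(px))) < 1e-12
```

Since mutual information is non-negative, this also shows that $$H(Y|X) \le H(Y)$$: conditioning on $$X$$ can never increase the average uncertainty about $$Y$$.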