mutual information

Noun

 * 1)  A measure of the entropic (informational) correlation between two random variables.
 * Mutual information $$I(X;Y)$$ between two random variables $$X$$ and $$Y$$ is what remains when their conditional entropies $$H(Y|X)$$ and $$H(X|Y)$$ are subtracted from their joint entropy $$H(X,Y)$$; that is, $$I(X;Y) = H(X,Y) - H(X|Y) - H(Y|X)$$. It is given by the formula $$I(X;Y) = - \sum_x \sum_y p_{X,Y} (x,y) \log_b {p_{X,Y} (x,y) \over p_{X|Y} (x|y) p_{Y|X} (y|x)}$$.
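
The definition can be checked numerically. The sketch below (function and variable names are illustrative, not from the source) computes $$I(X;Y)$$ from a joint distribution using the equivalent marginal form $$\sum_x \sum_y p_{X,Y}(x,y) \log_b {p_{X,Y}(x,y) \over p_X(x) p_Y(y)}$$, to which the formula above reduces, since $$p_{X|Y}(x|y) \, p_{Y|X}(y|x) = p_{X,Y}(x,y)^2 / (p_X(x) \, p_Y(y))$$.

```python
import math

def mutual_information(joint, base=2):
    """Mutual information I(X;Y) in units of log base `base` (bits by default).

    `joint` maps pairs (x, y) to probabilities p(x, y); entries must sum to 1.
    """
    # Marginal distributions p_X and p_Y, obtained by summing out the other variable.
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    # I(X;Y) = sum over (x, y) of p(x,y) * log[ p(x,y) / (p_X(x) p_Y(y)) ].
    return sum(p * math.log(p / (px[x] * py[y]), base)
               for (x, y), p in joint.items() if p > 0)

# A fair coin copied exactly: X and Y always agree, so I(X;Y) = H(X) = 1 bit.
print(mutual_information({(0, 0): 0.5, (1, 1): 0.5}))  # → 1.0

# Independent fair coins share no information: I(X;Y) = 0.
print(mutual_information({(x, y): 0.25 for x in (0, 1) for y in (0, 1)}))  # → 0.0
```

For independent variables the joint factors as $$p_{X,Y}(x,y) = p_X(x) p_Y(y)$$, every log term vanishes, and the sum is zero, matching the second example.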

Translations

 * Chinese:
 * Mandarin: 互信息, 互資訊