joint entropy

Noun

 * 1)  The Shannon entropy of a "script" whose "characters" are elements of the Cartesian product of the sets of characters of the component scripts.
 * If random variables $$X$$ and $$Y$$ are mutually independent, then their joint entropy $$H(X,Y)$$ is simply the sum $$H(X) + H(Y)$$ of the component entropies. If they are not mutually independent, then their joint entropy is $$H(X) + H(Y) - I(X;Y)$$, where $$I(X;Y)$$ is the mutual information of $$X$$ and $$Y$$.
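
The identity above can be checked numerically. The following is a minimal sketch in Python, assuming a made-up 2×2 joint distribution `p_xy` for two dependent binary variables; the function names and the example distribution are illustrative, not from the source.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a probability array, ignoring zero entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical joint distribution of binary X (rows) and Y (columns);
# chosen so that X and Y are NOT independent.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1)  # marginal distribution of X
p_y = p_xy.sum(axis=0)  # marginal distribution of Y

h_xy = entropy(p_xy.ravel())           # joint entropy H(X,Y)
h_x, h_y = entropy(p_x), entropy(p_y)  # component entropies H(X), H(Y)
i_xy = h_x + h_y - h_xy                # mutual information I(X;Y)

# H(X,Y) equals H(X) + H(Y) - I(X;Y); with I(X;Y) = 0 it reduces to H(X) + H(Y).
print(f"H(X,Y)                = {h_xy:.4f}")
print(f"H(X) + H(Y) - I(X;Y)  = {h_x + h_y - i_xy:.4f}")
```

For this distribution $$H(X) = H(Y) = 1$$ bit while $$H(X,Y) \approx 1.72$$ bits, so $$I(X;Y) \approx 0.28$$ bits of the component entropies is shared rather than added.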