B1.14

Information Theory
Part II, 2002

(a) Define the entropy $h(X)$ and the mutual entropy $i(X, Y)$ of random variables $X$ and $Y$. Prove the inequality

$$0 \leqslant i(X, Y) \leqslant \min \{h(X), h(Y)\}.$$

[You may assume the Gibbs inequality.]
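
As a quick numerical sanity check of this inequality, the sketch below computes $h(X)$, $h(Y)$ and $i(X, Y) = h(X) + h(Y) - h(X, Y)$ for one small joint distribution; the table `p_xy` is an arbitrary choice made purely for illustration, not part of the question's data.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a probability array, ignoring zero entries."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Joint distribution of (X, Y) on a 2 x 3 alphabet (rows: values of X, columns: values of Y).
p_xy = np.array([[0.20, 0.10, 0.15],
                 [0.05, 0.30, 0.20]])

p_x = p_xy.sum(axis=1)            # marginal of X
p_y = p_xy.sum(axis=0)            # marginal of Y
h_x, h_y = entropy(p_x), entropy(p_y)
i_xy = h_x + h_y - entropy(p_xy.ravel())   # i(X, Y) = h(X) + h(Y) - h(X, Y)

print(f"h(X) = {h_x:.4f}, h(Y) = {h_y:.4f}, i(X, Y) = {i_xy:.4f}")
assert -1e-12 <= i_xy <= min(h_x, h_y) + 1e-12   # 0 <= i(X,Y) <= min{h(X), h(Y)}
```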

(b) Let $X$ be a random variable and let $\mathbf{Y}=\left(Y_{1}, \ldots, Y_{n}\right)$ be a random vector.

(i) Prove, or disprove by producing a counterexample, the inequality

$$i(X, \mathbf{Y}) \leqslant \sum_{j=1}^{n} i\left(X, Y_{j}\right),$$

first under the assumption that $Y_{1}, \ldots, Y_{n}$ are independent random variables, and then under the assumption that $Y_{1}, \ldots, Y_{n}$ are conditionally independent given $X$.
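
One way to experiment with part (i) numerically is to fix a small joint distribution and compare both sides directly. The sketch below uses a hypothetical example with $n = 2$, chosen only for illustration: $Y_{1}, Y_{2}$ are independent fair bits and $X = Y_{1} \oplus Y_{2}$. Whether this particular choice settles the question is left to the reader.

```python
import numpy as np

def H(p):
    """Shannon entropy (bits) of a probability array, ignoring zero entries."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def mi(p_ab):
    """Mutual information (bits) from a 2-D joint pmf with axes (A, B)."""
    return H(p_ab.sum(axis=1)) + H(p_ab.sum(axis=0)) - H(p_ab.ravel())

# Joint pmf p[x, y1, y2]: Y1, Y2 independent fair bits, X = Y1 XOR Y2.
p = np.zeros((2, 2, 2))
for y1 in (0, 1):
    for y2 in (0, 1):
        p[y1 ^ y2, y1, y2] = 0.25

i_x_yvec = mi(p.reshape(2, 4))                 # i(X, (Y1, Y2))
i_sum = mi(p.sum(axis=2)) + mi(p.sum(axis=1))  # i(X, Y1) + i(X, Y2)
print(i_x_yvec, i_sum)                         # prints 1.0 and 0.0
```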

(ii) Prove, or disprove by producing a counterexample, the inequality

$$i(X, \mathbf{Y}) \geqslant \sum_{j=1}^{n} i\left(X, Y_{j}\right),$$

first under the assumption that $Y_{1}, \ldots, Y_{n}$ are independent random variables, and then under the assumption that $Y_{1}, \ldots, Y_{n}$ are conditionally independent given $X$.
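
A similar numerical experiment can be run under the conditional-independence assumption. The sketch below uses a hypothetical model chosen only for illustration: $X$ is a fair bit and each $Y_{j}$ is obtained from $X$ through an independent binary symmetric channel with crossover probability $\varepsilon = 0.1$, so $Y_{1}, Y_{2}$ are conditionally independent given $X$.

```python
import numpy as np

def H(p):
    """Shannon entropy (bits) of a probability array, ignoring zero entries."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def mi(p_ab):
    """Mutual information (bits) from a 2-D joint pmf with axes (A, B)."""
    return H(p_ab.sum(axis=1)) + H(p_ab.sum(axis=0)) - H(p_ab.ravel())

# Joint pmf p[x, y1, y2]: X is a fair bit and each Y_j equals X flipped
# independently with probability eps, so Y1, Y2 are conditionally
# independent given X.
eps = 0.1
p = np.zeros((2, 2, 2))
for x in (0, 1):
    for y1 in (0, 1):
        for y2 in (0, 1):
            f1 = eps if y1 != x else 1 - eps
            f2 = eps if y2 != x else 1 - eps
            p[x, y1, y2] = 0.5 * f1 * f2

i_x_yvec = mi(p.reshape(2, 4))                 # i(X, (Y1, Y2))
i_sum = mi(p.sum(axis=2)) + mi(p.sum(axis=1))  # i(X, Y1) + i(X, Y2)
print(i_x_yvec, i_sum)                         # here the sum is the larger of the two
```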