Let Y be a random variable with possible outcomes {0, 1} and p(Y = 1) = 1/2. Let X be a random variable with possible outcomes in {a, b, c}. Define

    p = (p(X = a | Y = 1), p(X = b | Y = 1), p(X = c | Y = 1)),
    q = (p(X = a | Y = 0), p(X = b | Y = 0), p(X = c | Y = 0)).

Suppose that

    p = (3/5, 1/5, 1/5),    q = (1/5, 3/5, 1/5).

(a) Using the definition of mutual information, show that for any choice of p, q,

    I(X;Y) = ( D_KL(p || m) + D_KL(q || m) ) / 2,    where m = (p + q) / 2.

(b) Compute I(X;Y).

(c) Let Z be a random variable with possible outcomes in {a, b, c} such that for all x in {a, b, c} and y in {0, 1}, p(Z = x | Y = y) = p(X = x | Y = 1 - y). Using part (a), compute I(Z;Y). Explain intuitively the relation of your answer to part (a).

(d) Suppose p and q are as in part (a). Using part (b), or otherwise, give an example of a random variable Z with possible outcomes in {a, b, c} satisfying I(X;Y) > I(Z;Y). Explain your answer in terms of the data-processing inequality.

(e) Suppose p and q are as in part (a). Give an example of a random variable Z with possible outcomes in {a, b, c} satisfying I(X;Y) < I(Z;Y). Explain your answer in terms of the data-processing inequality.
Step by Step Solution
There are 3 steps in this solution.
Step: 1

(a) Proof using the definition of mutual information. The mutual information I(X;Y) is defined as

    I(X;Y) = sum_{x, y} p(x, y) log( p(x, y) / (p(x) p(y)) ),

where the sum runs over x in {a, b, c} and y in {0, 1}, and p(x, y) = p(Y = y) p(X = x | Y = y). Because p(Y = 1) = p(Y = 0) = 1/2, the marginal of X is p(x) = (p(x | Y = 1) + p(x | Y = 0)) / 2 = m(x). Splitting the sum over y gives

    I(X;Y) = (1/2) sum_x p(x | Y = 1) log( p(x | Y = 1) / m(x) )
           + (1/2) sum_x p(x | Y = 0) log( p(x | Y = 0) / m(x) )
           = ( D_KL(p || m) + D_KL(q || m) ) / 2,

with m = (p + q) / 2, as required.
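As a quick numerical check of the identity above (not part of the original solution), the short Python sketch below evaluates both sides directly, assuming the conditional distributions are p = (3/5, 1/5, 1/5) and q = (1/5, 3/5, 1/5) as given in the statement; substitute the actual values if they differ. The two printed numbers agree, and their common value (about 0.151 bits) is the quantity asked for in part (b).

from math import log2

# Assumed conditionals p(X = x | Y = 1) and p(X = x | Y = 0) for x = a, b, c.
p = [3/5, 1/5, 1/5]
q = [1/5, 3/5, 1/5]
m = [(pi + qi) / 2 for pi, qi in zip(p, q)]   # marginal of X, since p(Y = 1) = 1/2

def kl(a, b):
    # D_KL(a || b) in bits; terms with a_i = 0 contribute 0.
    return sum(ai * log2(ai / bi) for ai, bi in zip(a, b) if ai > 0)

# Mutual information computed straight from the joint p(x, y) = p(y) p(x | y), p(y) = 1/2.
mi = 0.0
for cond in (p, q):
    for px_given_y, px in zip(cond, m):
        if px_given_y > 0:
            mi += 0.5 * px_given_y * log2((0.5 * px_given_y) / (px * 0.5))

print(mi)                            # left-hand side, I(X;Y)
print((kl(p, m) + kl(q, m)) / 2)     # right-hand side, the averaged KL divergences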
Step: 2
Step: 3
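For parts (d) and (e), the data-processing inequality says that any Z obtained by applying a fixed function to X alone can only lose information about Y, so I(Z;Y) <= I(X;Y); a Z built from Y directly is not constrained in this way. As a minimal illustration of the first case (again assuming the same p and q as above), the Python sketch below merges the outcomes b and c of X into a single outcome and compares the two mutual informations.

from math import log2

p = [3/5, 1/5, 1/5]                  # assumed p(X = x | Y = 1)
q = [1/5, 3/5, 1/5]                  # assumed p(X = x | Y = 0)

def mutual_info(cond1, cond0):
    # I(.;Y) in bits for a variable with these conditionals and p(Y = 1) = 1/2.
    m = [(a + b) / 2 for a, b in zip(cond1, cond0)]
    total = 0.0
    for cond in (cond1, cond0):
        total += 0.5 * sum(c * log2(c / mx) for c, mx in zip(cond, m) if c > 0)
    return total

# Z = f(X) with f(a) = a and f(b) = f(c) = c, i.e. merge the last two outcomes.
pz = [p[0], p[1] + p[2]]
qz = [q[0], q[1] + q[2]]

print(mutual_info(p, q))             # I(X;Y)
print(mutual_info(pz, qz))           # I(Z;Y), no larger than I(X;Y)

With these assumed values, I(Z;Y) comes out to about 0.125 bits versus I(X;Y) of about 0.151 bits, consistent with the data-processing inequality.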