Question
1. Given two random variables X and Y and their joint distribution P(X, Y):

                 X = 2   X = 4   X = 6
  Y = Red (2)     0.18    0.15    0.07
  Y = Blue (4)    0.15    0.05    0.15
  Y = Green (6)   0.04    0.16    0.05

Find:
1.a) H(X) (10 points)
1.b) H(Y) (10 points)
1.c) D(X||Y) (10 points)
1.d) D(Y||X) (10 points)
1.e) H(X|Y) (10 points)
1.f) H(Y|X) (10 points)
1.g) H(X,Y) (10 points)
1.h) H(Y) - H(Y|X) (10 points)
1.i) I(X;Y) (10 points)
1.j) If X is the number of wheels on a vehicle and Y is the color of the vehicle, what does I(X;Y) tell us? (Assume we can observe the number of wheels easily but not the color.) (10 points)

Bonus

2. Using a standard (balanced) die, the following sequence of numbers is rolled: 2, 4, 1, 3. Compute the amount of information in bits for this event (i.e., the minimal amount of information required to store this sequence). (10 points)

3. Consider two dice, one balanced (the probability of each of the 6 numbers is equal) and one loaded, where the probability of 2 is 0.2, the probability of 6 is 0.4, and the probabilities of the other numbers are equal. Compute the entropy of an event produced by each of the two dice. (10 points)

4. The entropy of a probability distribution is 3.6 bits. Is this enough information to calculate how many hartleys the entropy of the same distribution is? If it can be calculated, how many hartleys is it? How would the answer change if we were asking about nats? (10 points)
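Below is a minimal numerical sketch of how these quantities could be computed, assuming Python/NumPy; the helper names (entropy_bits, kl_bits) are made up for illustration, and D(X||Y) is taken to compare the marginal distributions of X and Y over the shared support {2, 4, 6}, as the Red (2) / Blue (4) / Green (6) labels suggest.

```python
import numpy as np

# Joint distribution P(X, Y) copied from the table above:
# rows are Y = Red(2), Blue(4), Green(6); columns are X = 2, 4, 6.
P = np.array([[0.18, 0.15, 0.07],
              [0.15, 0.05, 0.15],
              [0.04, 0.16, 0.05]])

def entropy_bits(p):
    """Shannon entropy in bits, skipping zero-probability entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def kl_bits(p, q):
    """KL divergence D(p||q) in bits (assumes q > 0 wherever p > 0)."""
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

p_x = P.sum(axis=0)            # marginal of X (column sums): [0.37, 0.36, 0.27]
p_y = P.sum(axis=1)            # marginal of Y (row sums):    [0.40, 0.35, 0.25]

H_X  = entropy_bits(p_x)       # 1.a
H_Y  = entropy_bits(p_y)       # 1.b
H_XY = entropy_bits(P.ravel()) # 1.g
H_X_given_Y = H_XY - H_Y       # 1.e (chain rule)
H_Y_given_X = H_XY - H_X       # 1.f
I_XY = H_Y - H_Y_given_X       # 1.h = 1.i, mutual information
D_X_Y = kl_bits(p_x, p_y)      # 1.c (marginals on the common support {2, 4, 6})
D_Y_X = kl_bits(p_y, p_x)      # 1.d

# Bonus 2: four independent rolls of a fair six-sided die.
info_sequence = 4 * np.log2(6)                          # ≈ 10.34 bits

# Bonus 3: fair die vs. loaded die with P(2)=0.2, P(6)=0.4, others 0.1 each.
H_fair   = entropy_bits(np.full(6, 1 / 6))              # ≈ 2.585 bits
H_loaded = entropy_bits(np.array([0.1, 0.2, 0.1, 0.1, 0.1, 0.4]))  # ≈ 2.322 bits

# Bonus 4: converting 3.6 bits to other units only requires a base change.
hartleys = 3.6 * np.log10(2)                            # ≈ 1.08 hartleys
nats     = 3.6 * np.log(2)                              # ≈ 2.50 nats

print(H_X, H_Y, H_XY, H_X_given_Y, H_Y_given_X, I_XY, D_X_Y, D_Y_X)
```

For 1.j, I(X;Y) can be read as the average number of bits of uncertainty about the vehicle's color that is removed by observing the number of wheels: the larger it is, the more the easily observed wheel count tells us about the hard-to-observe color.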