Question

Let X be a random variable with two outcomes X1 and X2, both occurring with positive probability: p(X1) = p and p(X2) = 1 − p. The (Shannon) entropy of X is defined to be

H(X) = p log2(1/p) + (1 − p) log2(1/(1 − p)) = −p log2(p) − (1 − p) log2(1 − p).

Note that H(X) > 0, since p and 1 − p are both strictly between 0 and 1, so their reciprocals exceed 1, and hence log2(1/p) and log2(1/(1 − p)) are both positive.
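For a quick numerical check of this definition, here is a minimal Python sketch (our own illustration, not part of the original question); the helper name binary_entropy is invented for this example:

from math import log2

def binary_entropy(p: float) -> float:
    """Shannon entropy, in bits, of a two-outcome variable with p(X1) = p."""
    if not 0.0 < p < 1.0:
        raise ValueError("p must be strictly between 0 and 1")
    # H(X) = -p*log2(p) - (1 - p)*log2(1 - p)
    return -p * log2(p) - (1 - p) * log2(1 - p)

# The entropy is positive for any p strictly between 0 and 1, as noted above.
print(binary_entropy(0.25))  # relevant to part (a), roughly 0.811 bits
print(binary_entropy(0.5))   # relevant to parts (b) and (c), exactly 1 bit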

(a) (2 marks) Suppose p(X1) = 1/4 and p(X2) = 3/4. Calculate H(X).

(b) (8 marks) Prove that if H(X) is maximal, then both outcomes are equally likely. Hint: this is first-year calculus: consider H(X) as a function of p, determine for which value of p it takes on its maximum, and then find that maximum.

(c) (2 marks) What is the maximal value of H(X)?
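For reference, here is a sketch of how the three parts can be worked out (our own outline of the standard argument, not the site's verified solution):

(a) H(X) = (1/4) log2(4) + (3/4) log2(4/3) = 1/2 + (3/4)(2 − log2(3)) = 2 − (3/4) log2(3) ≈ 0.811 bits.

(b) View H as a function of p on the interval (0, 1): H(p) = −p log2(p) − (1 − p) log2(1 − p). Using d/dp log2(p) = 1/(p ln 2), the derivative simplifies to H′(p) = log2((1 − p)/p). This vanishes exactly when (1 − p)/p = 1, i.e. p = 1/2, and H′(p) is positive for p < 1/2 and negative for p > 1/2, so p = 1/2 is the unique maximum. Hence if H(X) is maximal, both outcomes are equally likely.

(c) The maximal value is H(1/2) = (1/2) log2(2) + (1/2) log2(2) = 1 bit.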
