Question
The following formulas are also provided to save you the trouble of looking them up elsewhere:

- Binomial coefficient: $\binom{n}{k} = \frac{n!}{k!\,(n-k)!}$
- Inclusion-exclusion principle: $P(A \cup B) = P(A) + P(B) - P(A \cap B)$
- Chain rule: $P(A \cap B) = P(A \mid B)\,P(B)$
- Odds: $\mathrm{Odds}(A) = \frac{P(A)}{P(\bar{A})} = \frac{P(A)}{1 - P(A)}$
- Conditional probability: $P(A \mid B) = \frac{P(A \cap B)}{P(B)}$
- Bayes' theorem: $P(H_i \mid E) = \frac{P(E \mid H_i)\,P(H_i)}{P(E \mid H_0)\,P(H_0) + P(E \mid H_1)\,P(H_1) + \cdots + P(E \mid H_n)\,P(H_n)}$
- Information content: $I(A) = -\log_2 P(A) = \log_2(1/P(A))$
- Entropy: $H(X) = -\sum_i P(x_i) \log_2 P(x_i)$

1. (18 points) Below are two alphabets, each consisting of five symbols with varying probabilities.

Alphabet 1
  Symbol     a    b    c    d    e
  Frequency  0.2  0.1  0.1  0.2  0.4

Alphabet 2
  Symbol     a    b    c     d     e
  Frequency  0.1  0.2  0.05  0.01  0.64

Below are two symbol codes for encoding the symbols a, b, c, d, e.

Code A
  Symbol     a    b    c     d     e
  Code word  101  11   1001  1000  0

Code B
  Symbol     a    b    c    d    e
  Code word  11   101  100  01   00

(a) Among the symbols of both alphabets, which symbol in which alphabet has the greatest information content? What is that information content?

(b) Find the entropy of alphabet 2.

(c) Each of these two codes is a Huffman code for one of the two alphabets. Which code belongs to which alphabet? How do you know?
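Parts (a) through (c) reduce to direct arithmetic with the information-content and entropy formulas above. Below is a minimal Python sketch of those computations; the dictionaries and function names are mine, not part of the problem statement, and the code-word lengths are simply read off Code A and Code B.

```python
import math

# Symbol probabilities from the problem statement.
alphabet1 = {"a": 0.2, "b": 0.1, "c": 0.1, "d": 0.2, "e": 0.4}
alphabet2 = {"a": 0.1, "b": 0.2, "c": 0.05, "d": 0.01, "e": 0.64}

# Code word lengths read off the two code tables.
code_a_len = {"a": 3, "b": 2, "c": 4, "d": 4, "e": 1}  # 101, 11, 1001, 1000, 0
code_b_len = {"a": 2, "b": 3, "c": 3, "d": 2, "e": 2}  # 11, 101, 100, 01, 00

def information_content(p):
    """I(A) = -log2 P(A): rarer symbols carry more information."""
    return -math.log2(p)

def entropy(dist):
    """H(X) = -sum_i P(x_i) log2 P(x_i)."""
    return -sum(p * math.log2(p) for p in dist.values())

def expected_length(dist, lengths):
    """Average code word length: sum_i P(x_i) * len(codeword_i)."""
    return sum(dist[s] * lengths[s] for s in dist)

# (a) The rarest symbol in either alphabet has the greatest information content.
for name, dist in [("alphabet 1", alphabet1), ("alphabet 2", alphabet2)]:
    for sym, p in dist.items():
        print(f"{name} '{sym}': I = {information_content(p):.2f} bits")

# (b) Entropy of alphabet 2.
print(f"H(alphabet 2) = {entropy(alphabet2):.2f} bits")

# (c) A Huffman code minimizes expected code length for its own alphabet,
# so each code should do better on the alphabet it was built for.
for cname, lengths in [("Code A", code_a_len), ("Code B", code_b_len)]:
    for aname, dist in [("alphabet 1", alphabet1), ("alphabet 2", alphabet2)]:
        print(f"{cname} on {aname}: E[length] = {expected_length(dist, lengths):.2f}")
```

Running this shows that symbol d of alphabet 2 ($P = 0.01$) has the greatest information content, $\log_2(100) \approx 6.64$ bits; the entropy of alphabet 2 comes out near 1.49 bits; and the expected lengths (Code A is cheaper on alphabet 2, Code B on alphabet 1) are consistent with Code A, whose lengths range from 1 to 4, being the Huffman code for the highly skewed alphabet 2 and Code B for the flatter alphabet 1.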