Question
Entropy: H(S) = -Σ_i p_i log2(p_i)
Average number of bits per symbol: L_avg = Σ_i p_i l_i, where l_i is the codeword length assigned to symbol i
Efficiency: H(S) / L_avg
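The three quantities above can be computed directly. Below is a minimal Python sketch using the probability distribution from part (b); the Huffman codeword lengths (1, 2, 3, 3) come from building the Huffman tree for that distribution (e.g. A=0, B=10, C=110, D=111 is one valid assignment):

```python
import math

def entropy(probs):
    """H(S) = -sum of p_i * log2(p_i), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def avg_length(probs, lengths):
    """L_avg = sum of p_i * l_i, in bits per symbol."""
    return sum(p * l for p, l in zip(probs, lengths))

# Distribution from part (b): P_A = P_B = 0.4, P_C = P_D = 0.1
probs = [0.4, 0.4, 0.1, 0.1]

# Huffman codeword lengths for this distribution: merging
# 0.1 + 0.1 -> 0.2, then 0.2 + 0.4 -> 0.6, then 0.6 + 0.4 -> 1.0
# yields depths 1, 2, 3, 3 for A, B, C, D respectively.
lengths = [1, 2, 3, 3]

H = entropy(probs)
L = avg_length(probs, lengths)
print(f"H(S)       = {H:.4f} bits/symbol")
print(f"L_avg      = {L:.4f} bits/symbol")
print(f"efficiency = {H / L:.4f}")
```

For this distribution, H(S) ≈ 1.722 bits/symbol and L_avg = 1.8 bits/symbol, giving an efficiency of about 0.957.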
5.3 Arithmetic Coding, Huffman Coding, and LZW are three popular lossless compression methods.
a) What are the advantages and disadvantages of Arithmetic Coding as compared to Huffman Coding?
b) Suppose the alphabet is [A, B, C, D], and the known probability distribution is P_A = 0.4, P_B = 0.4, P_C = 0.1, P_D = 0.1. For simplicity, let's also assume that both encoder and decoder know that the length of messages is always 4, so there is no need for a terminator.
i) How many bits are needed to encode the message BBBB by Huffman coding?
ii) How many bits are needed to encode the message BBB by arithmetic coding?
iii) How many bits are needed to encode the message BBB by LZW considering that the initial dictionary A=00, B=01, C=10?
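To see where the arithmetic-coding answer in (ii) comes from, the sketch below tracks how the coding interval shrinks as each symbol is encoded. It assumes the cumulative intervals are laid out in the order A, B, C, D (the problem does not fix an ordering); the exact bit count also depends on the termination convention, so only the common lower bound ceil(-log2(width)) is printed:

```python
import math

def encode_interval(message, probs):
    """Return the final [low, high) arithmetic-coding interval."""
    # Cumulative lower bound of each symbol's subinterval,
    # in the dict's insertion order (assumed A, B, C, D).
    cum, acc = {}, 0.0
    for s, p in probs.items():
        cum[s] = acc
        acc += p
    low, width = 0.0, 1.0
    for s in message:
        low = low + width * cum[s]   # shift into the symbol's subinterval
        width = width * probs[s]     # shrink by the symbol's probability
    return low, low + width

probs = {"A": 0.4, "B": 0.4, "C": 0.1, "D": 0.1}
low, high = encode_interval("BBB", probs)
width = high - low
print(f"interval = [{low:.4f}, {high:.4f}), width = {width:.4f}")
print(f"ceil(-log2 width) = {math.ceil(-math.log2(width))} bits")
```

Encoding BBB gives the interval [0.624, 0.688) of width 0.4^3 = 0.064, so at least ceil(-log2 0.064) = 4 bits are needed to identify a number inside it; some conventions add one extra bit to guarantee unique decodability.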