Question
MATLAB
Suppose a data source produces a series of characters drawn from a set of M distinct symbols. If symbol k is produced with probability pk, the first-order entropy of the source is defined as

H_1 = -\sum_{k=1}^{M} p_k \log_2 p_k
Essentially, H1 is the number of bits needed per symbol to encode a long message; that is, it measures the amount of information content and therefore the potential success of compression strategies. The value H1 = 0 corresponds to the case of only one symbol being produced (no information), while if all M symbols occur with equal probability, then H1 = log2(M).
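As a quick numerical check of the two limiting cases just mentioned, here is a small MATLAB sketch (the probability vectors are chosen purely for illustration):

p = 1;                         % a single symbol with probability 1
H1 = -sum(p .* log2(p))        % gives H1 = 0 (no information)

p = ones(1, 8) / 8;            % M = 8 equally likely symbols
H1 = -sum(p .* log2(p))        % gives H1 = log2(8) = 3 bits per symbol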
Write a function [H,M] = entropy(v) that computes the entropy of a vector v. The probabilities should be computed empirically by finding the unique entries (using unique), then counting the occurrences of each symbol and dividing by the length of v. Try your function on some built-in image data by entering load clown, v = X(:); .
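One possible implementation, following the recipe in the statement (the variable names and the loop over symbols are my own choices, not part of the original problem; note also that this function shadows the Image Processing Toolbox function of the same name):

function [H, M] = entropy(v)
% ENTROPY  Empirical first-order entropy of the symbols in v.
%   H is the entropy in bits per symbol, M the number of distinct symbols.
symbols = unique(v);                      % distinct symbols appearing in v
M = numel(symbols);
p = zeros(M, 1);                          % empirical probability of each symbol
for k = 1:M
    p(k) = sum(v == symbols(k)) / numel(v);
end
H = -sum(p .* log2(p));                   % every p(k) > 0, so log2 is safe
end

To try it on the suggested image data: load clown, v = X(:); [H, M] = entropy(v). Here X is the indexed-image matrix from clown.mat, so v is a long vector of symbol values.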