
[31]

(a) Let ν be a probability measure and G(n) = E f(ω1:n), with f(ω1:n) = log ν(ω1:n) + log 1/Mc(ω1:n). (E f(ω1:n) denotes the ν-expectation ∑_{l(x)=n} ν(x)f(x).) Show that lim_{n→∞} G(n) = ∞.

(b) Let ν be a computable probability measure, and let F be a computable function such that lim_{n→∞} F(n) = ∞. Show that there is a constant c such that for all binary x of length n we have log ν(x) + log 1/Mc(x) < c + F(n).

Comments. Hint: use Exercises 4.5.8 and 4.5.9 to solve Item (a). We can interpret log 1/Mc(x) as the length of the Shannon–Fano code using distribution Mc, and log 1/ν(x) as the length of the Shannon–Fano code using the actually reigning ν. Clearly, although log(ν/Mc) approaches infinity with n, it does so more slowly than any computable function. In contrast, log(ν/M), or log(ν/Mnorm), is bounded by a constant. Source: [R.J. Solomonoff, Ibid.].
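The quantity G(n) is the expected coding redundancy: the extra bits paid, on average, when strings drawn from ν are coded with the Shannon–Fano code for the wrong distribution. Since Mc itself is not computable, the following sketch substitutes a simple stand-in coding distribution (the uniform measure) for Mc, and a Bernoulli(0.7) process for ν; both choices are illustrative assumptions, not part of the exercise. For such a fixed non-universal coding distribution the expected redundancy grows linearly in n, whereas the exercise shows that for Mc it still diverges but more slowly than any computable function, and for M it stays bounded by a constant.

```python
import itertools
import math

P = 0.7  # assumed toy computable measure nu: i.i.d. Bernoulli(P) bits

def nu(x):
    # probability of binary string x under the Bernoulli(P) measure
    return math.prod(P if b == '1' else 1 - P for b in x)

def uniform(x):
    # uniform measure on binary strings; an illustrative stand-in for the
    # (uncomputable) coding distribution Mc
    return 2.0 ** -len(x)

def redundancy(x):
    # log nu(x) + log 1/uniform(x): extra Shannon-Fano code length, in bits,
    # paid for coding x with uniform instead of the true measure nu
    return math.log2(nu(x)) - math.log2(uniform(x))

def G(n):
    # expected redundancy over all strings of length n (the analogue of G(n)),
    # i.e. the nu-expectation of redundancy restricted to l(x) = n
    return sum(nu(''.join(bits)) * redundancy(''.join(bits))
               for bits in itertools.product('01', repeat=n))
```

Because both measures here are product measures, G(n) = n·(1 − H(0.7)) grows linearly with n; this makes concrete why a fixed non-universal code cannot match the constant-bounded behavior of log(ν/M).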
