Question:
[22] Let μ(x) be a lower semicomputable probability measure. Suppose we define the cooccurrence of events and conditional events anew as follows. The probability of cooccurrence is μ(x, y) = μ(x) if y is a prefix of x, μ(x, y) = μ(y) if x is a prefix of y, and μ(x, y) = 0 otherwise. The conditional probability is defined as μ(y|x) = μ(x, y)/μ(x). The complexities are defined as follows: the unconditional complexity is Kμ(x) = log 1/μ(x), the cooccurrence complexity is Kμ(x, y) = log 1/μ(x, y), and the conditional complexity is Kμ(y|x) = log 1/μ(y|x). Show that these complexities satisfy the information-theoretic symmetry of information exactly: Kμ(x, y) = Kμ(x) + Kμ(y|x) (Sections 1.11, 2.8, and 3.8.1).
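Hint: a minimal worked-equation sketch (assuming μ(x) > 0 and μ(x, y) > 0 so that all quantities are defined). The equality follows by rearranging the definition of the conditional probability, with no additive error term:

\begin{align*}
K_\mu(x, y) &= \log \frac{1}{\mu(x, y)}
             = \log \frac{1}{\mu(x)\,\mu(y \mid x)} \\
            &= \log \frac{1}{\mu(x)} + \log \frac{1}{\mu(y \mid x)}
             = K_\mu(x) + K_\mu(y \mid x).
\end{align*}

Here the second step uses μ(x, y) = μ(x) μ(y|x), which is just the definition μ(y|x) = μ(x, y)/μ(x) rewritten.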
Comments. Similar (probability) definitions were used by D.G. Willis [J. ACM, 17(1970), 241–259]. Source: [R.J. Solomonoff, Ibid.]. Solomonoff used Mnorm instead of an arbitrary measure μ.
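To make the definitions concrete, here is a small numerical check. The particular measure (the uniform measure 2^(-|x|) on binary strings) and the helper names are illustrative assumptions standing in for the arbitrary lower semicomputable μ of the exercise; logarithms are taken base 2.

import math

def mu(x: str) -> float:
    # Toy measure standing in for the mu of the exercise:
    # the uniform measure mu(x) = 2^{-len(x)} on binary strings.
    return 2.0 ** (-len(x))

def mu_joint(x: str, y: str) -> float:
    # Cooccurrence as defined in the exercise: mu(x) if y is a prefix of x,
    # mu(y) if x is a prefix of y, and 0 otherwise.
    if x.startswith(y):
        return mu(x)
    if y.startswith(x):
        return mu(y)
    return 0.0

def mu_cond(y: str, x: str) -> float:
    # Conditional probability mu(y | x) = mu(x, y) / mu(x).
    return mu_joint(x, y) / mu(x)

def K(p: float) -> float:
    # Complexity of an event of probability p: log 1/p (base 2 here).
    return math.log2(1.0 / p)

if __name__ == "__main__":
    x, y = "0101", "01"   # y is a prefix of x, so mu(x, y) > 0
    lhs = K(mu_joint(x, y))
    rhs = K(mu(x)) + K(mu_cond(y, x))
    print(lhs, rhs)       # both equal 4.0: the symmetry of information holds exactly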
