Question:
Prove the properties of entropy enumerated in Sec. 4.11.4.
Data from Sec. 4.11.4
Because of the similarity of the general formulas for information and entropy (both proportional to $-\sum_n p_n \ln p_n$), information has properties very similar to those of entropy.
1. Information is additive (just as entropy is additive). The information in two successive, independent messages is the sum of the information in each message.
2. If the frequencies of occurrence of the symbols in a message are $p_n = 0$ for all symbols except one, which has $p_n = 1$, then the message contains zero information. This is analogous to the vanishing of the entropy when all states have zero probability except for one, which has unit probability.
3. For a message $L$ symbols long, whose symbols are drawn from a pool of $N$ distinct symbols, the information content is maximized when the probabilities of the symbols are all equal ($p_n = 1/N$), and the maximal value of the information is $I = L \log_2 N$. This is analogous to the microcanonical ensemble having maximal entropy.
Step by Step Answer:
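Property 1 (additivity). Throughout, take the information per symbol to be $I = -\sum_n p_n \log_2 p_n$; the same arguments go through with $\ln$ in place of $\log_2$, since the two differ only by an overall constant. For two successive, statistically independent messages with symbol probabilities $p_m$ and $q_n$, the joint probabilities factor, $P_{mn} = p_m q_n$, so

$$I_{12} = -\sum_{m,n} p_m q_n \log_2 (p_m q_n) = -\sum_m p_m \log_2 p_m - \sum_n q_n \log_2 q_n = I_1 + I_2,$$

where the normalizations $\sum_m p_m = \sum_n q_n = 1$ collapse the cross sums. This is the same calculation that shows entropy is additive for statistically independent subsystems.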
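Property 2 (a fully predictable message carries no information). If $p_k = 1$ for one symbol and $p_n = 0$ for all the others, the $k$-th term of $-\sum_n p_n \log_2 p_n$ vanishes because $\log_2 1 = 0$, and each remaining term vanishes in the limit

$$\lim_{p \to 0^+} \left(-p \log_2 p\right) = 0,$$

so $I = 0$. This mirrors the vanishing of the entropy for a system known with certainty to occupy a single state.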
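Property 3 (equal probabilities maximize the information). Extremize the per-symbol information subject to the normalization constraint $\sum_n p_n = 1$ with a Lagrange multiplier $\lambda$:

$$\frac{\partial}{\partial p_n}\left[-\sum_m p_m \log_2 p_m - \lambda\left(\sum_m p_m - 1\right)\right] = -\log_2 p_n - \frac{1}{\ln 2} - \lambda = 0.$$

The solution is independent of $n$, so all the probabilities are equal: $p_n = 1/N$. (The second derivative, $-1/(p_n \ln 2) < 0$, confirms that this stationary point is a maximum.) By the additivity of property 1, a message of $L$ statistically independent symbols then carries

$$I = -L \sum_{n=1}^{N} \frac{1}{N} \log_2 \frac{1}{N} = L \log_2 N,$$

in direct analogy with the microcanonical ensemble, whose equal probabilities maximize the entropy.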
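As a quick numerical sanity check (a minimal sketch, not from the text; the helper info_bits and the alphabet size N = 8 are illustrative choices), one can verify that the uniform distribution maximizes the per-symbol information, that a one-hot distribution gives zero, and that $I = L \log_2 N$:

import numpy as np

def info_bits(p):
    # Information per symbol in bits, -sum_n p_n log2 p_n, skipping p_n = 0
    # terms (their contribution vanishes in the limit, as in property 2).
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz])) + 0.0  # +0.0 avoids a cosmetic -0.0

N, L = 8, 100                      # illustrative alphabet size and message length
uniform = np.full(N, 1.0 / N)
print(L * info_bits(uniform))      # 300.0 bits
print(L * np.log2(N))              # 300.0 = L log2 N, matching property 3

rng = np.random.default_rng(0)
for _ in range(5):                 # random distributions never beat the uniform one
    assert info_bits(rng.dirichlet(np.ones(N))) <= info_bits(uniform) + 1e-12

one_hot = np.zeros(N)
one_hot[0] = 1.0
print(info_bits(one_hot))          # 0.0 bits, matching property 2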
Modern Classical Physics: Optics, Fluids, Plasmas, Elasticity, Relativity, and Statistical Physics
ISBN: 9780691159027
1st Edition
Authors: Kip S. Thorne, Roger D. Blandford