Question

The transition matrix for a Markov chain is

P = [0.7  0  0.3]
    [0    1  0  ]
    [0.2  0  0.8]

(A) Show that R = [0.4  0  0.6] and S = [0  1  0] are both stationary matrices for P. Explain why this does not contradict Theorem 1(A).

(B) Find another stationary matrix for P. [Hint: Consider T = aR + (1 - a)S, where 0 < a < 1.]

(C) How many different stationary matrices does P have?
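Parts (A) and (B) can be checked numerically. The following is a minimal sketch (not part of the textbook or the expert solution) using NumPy; the value a = 0.25 is an arbitrary illustrative choice.

import numpy as np

# Transition matrix from the problem statement.
P = np.array([[0.7, 0.0, 0.3],
              [0.0, 1.0, 0.0],
              [0.2, 0.0, 0.8]])

# Candidate stationary (row) matrices from part (A).
R = np.array([0.4, 0.0, 0.6])
S = np.array([0.0, 1.0, 0.0])

# A row matrix X is stationary for P when XP = X.
print(np.allclose(R @ P, R))   # True: RP = R
print(np.allclose(S @ P, S))   # True: SP = S

# Part (B): a convex combination T = aR + (1 - a)S with 0 < a < 1
# satisfies TP = aRP + (1 - a)SP = aR + (1 - a)S = T.
a = 0.25                       # arbitrary value strictly between 0 and 1
T = a * R + (1 - a) * S
print(np.allclose(T @ P, T))   # True: TP = T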

INFORMATION

THEOREM 1  PROPERTIES OF REGULAR MARKOV CHAINS

Let P be the transition matrix for a regular Markov chain.

(A) There is a unique stationary matrix S that can be found by solving the equation SP = S.
(B) Given any initial-state matrix S_0, the state matrices S_k approach the stationary matrix S.
(C) The matrices P^k approach a limiting matrix P̄, where each row of P̄ is equal to the stationary matrix S.

Barnett, Raymond A. Finite Mathematics for Business, Economics, Life Sciences, and Social Sciences, 2nd Edition. Pearson Learning Solutions, 2013-04-13. VitalBook file.
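Theorem 1 applies only to regular chains. As an illustration (not from the textbook, and deliberately using a different matrix Q rather than the P above), the sketch below shows all three parts of the theorem for a small regular transition matrix:

import numpy as np

# Q is regular (every entry is positive), so Theorem 1 applies.
# This matrix is chosen only for illustration; it is not the P above.
Q = np.array([[0.6, 0.4],
              [0.2, 0.8]])

# (A) The unique stationary matrix S solves SQ = S with entries summing to 1.
S = np.array([1/3, 2/3])
print(np.allclose(S @ Q, S))               # True

# (B) Any initial-state matrix S_0 approaches S as k grows.
S0 = np.array([1.0, 0.0])
print(S0 @ np.linalg.matrix_power(Q, 50))  # approx. [0.3333 0.6667]

# (C) The powers Q^k approach a limiting matrix whose rows all equal S.
print(np.linalg.matrix_power(Q, 50))       # both rows approx. [0.3333 0.6667]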
