
Question:

34. Let P(1) and P(2) denote transition probability matrices for ergodic Markov chains having the same state space. Let π1 and π2 denote the stationary (limiting) probability vectors for the two chains. Consider a process defined as follows:

(i) X0 = 1. A coin is then flipped and if it comes up heads, then the remaining states X1, X2, ... are obtained from the transition probability matrix P(1), and if tails, from the matrix P(2). Is {Xn, n ≥ 0} a Markov chain? If p = P(coin comes up heads), what is lim_{n→∞} P(Xn = j)?
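For part (i), one way to see the limiting probability (a sketch not included in the original page) is to condition on the outcome of the single coin flip; by ergodicity each conditional limit exists and equals the corresponding stationary probability:

$$
\lim_{n\to\infty} P(X_n = j)
  = p \lim_{n\to\infty} P(X_n = j \mid \text{heads})
    + (1-p) \lim_{n\to\infty} P(X_n = j \mid \text{tails})
  = p\,\pi_1(j) + (1-p)\,\pi_2(j).
$$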

(ii) X0 = 1. At each stage the coin is flipped and if it comes up heads, then the next state is chosen according to P(1), and if tails comes up, then it is chosen according to P(2). In this case do the successive states constitute a Markov chain? If so, determine the transition probabilities. Show by a counterexample that the limiting probabilities are not the same as in part (i).
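A minimal numeric sketch of the counterexample requested in part (ii). In part (ii) the process is a Markov chain with transition matrix Q = p·P(1) + (1−p)·P(2), so its limit is the stationary vector of Q, not the mixture p·π1 + (1−p)·π2 from part (i). The specific 2-state matrices and the choice p = 1/2 below are my own illustrative assumptions, not taken from the original text:

```python
# For a 2-state chain [[1-a, a], [b, 1-b]] with a, b > 0, the stationary
# vector solves pi = pi * P and works out to (b/(a+b), a/(a+b)).

def stationary_2state(a, b):
    """Stationary distribution of the chain [[1-a, a], [b, 1-b]]."""
    return (b / (a + b), a / (a + b))

# Two ergodic chains on states {0, 1} (illustrative choices):
#   P1 = [[0.5, 0.5], [1.0, 0.0]]  ->  pi1 = (2/3, 1/3)
#   P2 = [[0.5, 0.5], [0.5, 0.5]]  ->  pi2 = (1/2, 1/2)
pi1 = stationary_2state(0.5, 1.0)
pi2 = stationary_2state(0.5, 0.5)
p = 0.5  # fair coin

# Part (i): limit is the mixture of the two stationary vectors.
limit_i = tuple(p * x + (1 - p) * y for x, y in zip(pi1, pi2))  # (7/12, 5/12)

# Part (ii): the chain moves by Q = p*P1 + (1-p)*P2,
# here Q = [[0.5, 0.5], [0.75, 0.25]], whose stationary vector is (3/5, 2/5).
limit_ii = stationary_2state(0.5, 0.75)

print(limit_i)   # -> (0.5833..., 0.4166...)
print(limit_ii)  # -> (0.6, 0.4), so the two limits differ
```

Since (7/12, 5/12) ≠ (3/5, 2/5), the limiting probabilities in the two parts genuinely differ for these matrices.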

