Question:

Let {Xn, n ≥ 0} denote an ergodic Markov chain with limiting probabilities πi.

Define the process {Yn, n ≥ 1} by Yn = (Xn−1, Xn). That is, Yn keeps track of the last two states of the original chain. Is {Yn, n ≥ 1} a Markov chain? If so, determine its transition probabilities and find lim n→∞ P{Yn = (i, j)}.

Step by Step Answer:

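The following is a sketch of the standard argument; here Pij denotes the one-step transition probability P{Xn+1 = j | Xn = i} of the original chain, a notation the question itself does not introduce.

Step 1 (Markov property). Suppose Yn = (i, j), so that Xn−1 = i and Xn = j. Then Yn+1 = (Xn, Xn+1) = (j, Xn+1), and by the Markov property of {Xn} the distribution of Xn+1 given the entire history depends only on Xn = j. Hence {Yn, n ≥ 1} is a Markov chain.

Step 2 (Transition probabilities). From the argument above,

P{Yn+1 = (j, k) | Yn = (i, j)} = Pjk,

and the probability of moving from (i, j) to any pair whose first coordinate is not j is 0.

Step 3 (Limiting probabilities). Conditioning on Xn−1,

P{Yn = (i, j)} = P{Xn−1 = i} · P{Xn = j | Xn−1 = i} = P{Xn−1 = i} Pij,

and since P{Xn−1 = i} → πi as n → ∞ by ergodicity,

lim n→∞ P{Yn = (i, j)} = πi Pij.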
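As a quick numerical sanity check, the short Python sketch below simulates a small chain and compares the empirical frequency of each pair (Xn−1, Xn) with πi Pij. The two-state transition matrix P is made up purely for illustration; it is not part of the original question.

    import numpy as np

    # Hypothetical 2-state ergodic chain; this matrix is arbitrary, for illustration only.
    P = np.array([[0.7, 0.3],
                  [0.4, 0.6]])

    # Stationary distribution pi: left eigenvector of P for eigenvalue 1, normalized.
    evals, evecs = np.linalg.eig(P.T)
    pi = np.real(evecs[:, np.isclose(evals, 1)][:, 0])
    pi /= pi.sum()

    # Simulate the chain and count occurrences of each pair Yn = (Xn-1, Xn).
    rng = np.random.default_rng(0)
    n_steps = 200_000
    x = 0
    pair_counts = np.zeros((2, 2))
    for _ in range(n_steps):
        x_next = rng.choice(2, p=P[x])
        pair_counts[x, x_next] += 1
        x = x_next

    # The empirical pair frequencies should approach pi_i * P_ij.
    print("empirical:\n", pair_counts / n_steps)
    print("theory:\n", pi[:, None] * P)

The two printed matrices should agree to a couple of decimal places, consistent with lim n→∞ P{Yn = (i, j)} = πi Pij.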