
Question:

*32. Let {X_n, n ≥ 0} denote an ergodic Markov chain with limiting probabilities π_i. Define the process {Y_n, n ≥ 1} by Y_n = (X_{n-1}, X_n). That is, Y_n keeps track of the last two states of the original chain. Is {Y_n, n ≥ 1} a Markov chain? If so, determine its transition probabilities and find

lim_{n→∞} P{Y_n = (i, j)}.


Step by Step Answer:
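A sketch of the standard reasoning (this is an outline, not the book's posted step-by-step solution): because X_{n+1} depends on the past only through X_n, the pair process {Y_n} is again a Markov chain. Writing P_{jk} for the transition probabilities of {X_n},

P{Y_{n+1} = (j, k) | Y_n = (i, j)} = P{X_{n+1} = k | X_n = j} = P_{jk},

and P{Y_{n+1} = (l, k) | Y_n = (i, j)} = 0 whenever l ≠ j, since the first coordinate of Y_{n+1} must equal the second coordinate of Y_n. For the limit, condition on X_{n-1} and use ergodicity of {X_n}:

lim_{n→∞} P{Y_n = (i, j)} = lim_{n→∞} P{X_{n-1} = i} P_{ij} = π_i P_{ij}.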
