Question:

Let $\{X_n,\ n \ge 0\}$ denote an ergodic Markov chain with limiting probabilities $\pi_i$. Define the process $\{Y_n,\ n \ge 1\}$ by $Y_n = (X_{n-1}, X_n)$. That is, $Y_n$ keeps track of the last two states of the original chain. Is $\{Y_n,\ n \ge 1\}$ a Markov chain? If so, determine its transition probabilities and find $\lim_{n \to \infty} P\{Y_n = (i,j)\}$.


Step by Step Answer:

Yes, $\{Y_n,\ n \ge 1\}$ is a Markov chain.

Step 1 (Markov property). Given $Y_n = (i, j)$, i.e. $X_{n-1} = i$ and $X_n = j$, the next state is $Y_{n+1} = (X_n, X_{n+1}) = (j, X_{n+1})$. Since $\{X_n\}$ is a Markov chain, the conditional distribution of $X_{n+1}$ depends only on $X_n = j$, not on $X_{n-1}$ or any earlier states. Hence $\{Y_n\}$ is a Markov chain.

Step 2 (Transition probabilities). For a target state of the form $(j, k)$,
$$P\{Y_{n+1} = (j, k) \mid Y_n = (i, j)\} = P\{X_{n+1} = k \mid X_n = j\} = P_{jk},$$
while the transition probability to any state whose first coordinate is not $j$ is $0$.

Step 3 (Limiting probabilities). Again using the Markov property of $\{X_n\}$,
$$P\{Y_n = (i,j)\} = P\{X_{n-1} = i,\ X_n = j\} = P\{X_{n-1} = i\}\, P_{ij} \longrightarrow \pi_i P_{ij} \quad \text{as } n \to \infty.$$
Hence $\lim_{n \to \infty} P\{Y_n = (i,j)\} = \pi_i P_{ij}$.
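The closed form $\pi_i P_{ij}$ is easy to check numerically. Below is a minimal simulation sketch in Python (NumPy); the 3-state transition matrix is an arbitrary example of my own choosing, not part of the original problem. It estimates the long-run frequency of each pair $(i,j)$ along one trajectory and compares it with $\pi_i P_{ij}$.

```python
import numpy as np

# Example 3-state ergodic chain (an arbitrary choice, not from the problem).
P = np.array([
    [0.2, 0.5, 0.3],
    [0.4, 0.1, 0.5],
    [0.3, 0.3, 0.4],
])
n_states = P.shape[0]

# Stationary distribution pi: solve pi P = pi together with sum(pi) = 1
# as an (overdetermined) least-squares system.
A = np.vstack([P.T - np.eye(n_states), np.ones(n_states)])
b = np.append(np.zeros(n_states), 1.0)
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

# Simulate the chain and count how often each pair (X_{n-1}, X_n) = (i, j) occurs.
rng = np.random.default_rng(0)
steps = 500_000
x = 0
pair_counts = np.zeros((n_states, n_states))
for _ in range(steps):
    x_next = rng.choice(n_states, p=P[x])
    pair_counts[x, x_next] += 1
    x = x_next

empirical = pair_counts / steps      # estimate of lim P{Y_n = (i, j)}
theoretical = pi[:, None] * P        # pi_i * P_{ij}
print(np.round(empirical, 3))
print(np.round(theoretical, 3))
```

For this run length the empirical and theoretical matrices should agree to roughly two or three decimal places.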
