Question:
47. Let {Xn, n ≥ 0} denote an ergodic Markov chain with limiting probabilities πi. Define the process {Yn, n ≥ 1} by Yn = (Xn−1, Xn). That is, Yn keeps track of the last two states of the original chain. Is {Yn, n ≥ 1} a Markov chain? If so, determine its transition probabilities and find lim n→∞ P{Yn = (i, j)}.
Step by Step Answer:
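Yes, {Yn, n ≥ 1} is a Markov chain. Suppose Yn = (i, j), i.e., Xn−1 = i and Xn = j. Then Yn+1 = (Xn, Xn+1) = (j, Xn+1), and by the Markov property of {Xn} the distribution of Xn+1 depends only on Xn = j, not on the earlier history. Hence

P{Yn+1 = (j, k) | Yn = (i, j), Yn−1, ..., Y1} = P{Xn+1 = k | Xn = j} = Pjk,

and a transition from (i, j) to any pair (l, k) with l ≠ j has probability 0, since the first coordinate of Yn+1 must equal the second coordinate of Yn. For the limiting probabilities, using the ergodicity of {Xn},

lim n→∞ P{Yn = (i, j)} = lim n→∞ P{Xn−1 = i} P{Xn = j | Xn−1 = i} = πi Pij.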
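As a quick numerical check, here is a minimal simulation sketch in Python. The 3-state transition matrix P is a hypothetical example chosen only for illustration (it is not part of the original problem); the script estimates the long-run frequency of each pair (i, j) and compares it with πi Pij.

```python
# Minimal sketch: estimate lim P{Yn = (i, j)} by simulation and compare
# with pi_i * P_ij. The transition matrix P below is a hypothetical
# example, not taken from the original problem.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ergodic transition matrix (assumption, for illustration).
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])
n_states = P.shape[0]

# Stationary distribution pi: normalized left eigenvector of P for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
pi /= pi.sum()

# Simulate the chain, counting each observed pair Yn = (X_{n-1}, X_n).
n_steps = 500_000
pair_counts = np.zeros((n_states, n_states))
x = 0
for _ in range(n_steps):
    x_next = rng.choice(n_states, p=P[x])
    pair_counts[x, x_next] += 1
    x = x_next

empirical = pair_counts / n_steps      # estimated P{Yn = (i, j)}
theoretical = pi[:, None] * P          # pi_i * P_ij

print("max abs difference:", np.abs(empirical - theoretical).max())
```

The reported difference should shrink roughly like 1/√n_steps, consistent with ordinary Monte Carlo error.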