Question:
25. Consider a Markov chain with states 0, 1, 2, 3, 4. Suppose P_{0,4} = 1,
and suppose that when the chain is in state i, i > 0, the next state is equally likely to be any of the states 0, 1, ..., i - 1. Find the limiting probabilities of this Markov chain.
Step by Step Answer:
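The answer body did not survive extraction, so here is a sketch of the standard solution. The limiting probabilities satisfy the balance equations π_j = Σ_i π_i P_{i,j}. State 4 is entered only from state 0, so π_4 = π_0; for j < 4, state j is entered from every state i > j with probability 1/i, so π_j = Σ_{i=j+1}^{4} π_i / i. Setting π_0 = c and solving downward gives π_3 = c/4, π_2 = c/3, π_1 = c/2, and the equation for π_0 comes out consistent; normalizing, c(1 + 1/2 + 1/3 + 1/4 + 1) = 1, so c = 12/37. A minimal check in Python with exact rational arithmetic (variable names are mine, not from the text):

```python
from fractions import Fraction

# Transition matrix: from state 0 go to state 4; from state i > 0,
# move to a uniformly chosen state in {0, ..., i-1}.
n = 5
P = [[Fraction(0)] * n for _ in range(n)]
P[0][4] = Fraction(1)
for i in range(1, n):
    for j in range(i):
        P[i][j] = Fraction(1, i)

# Balance equations: pi_4 = pi_0, and pi_j = sum_{i>j} pi_i / i for j < 4.
c = Fraction(1)                      # tentative value of pi_0 before normalizing
pi = [Fraction(0)] * n
pi[0] = c
pi[4] = c
for j in range(3, 0, -1):
    pi[j] = sum(pi[i] / i for i in range(j + 1, n))

total = sum(pi)                      # c * (1 + 1/2 + 1/3 + 1/4 + 1) = 37c/12
pi = [p / total for p in pi]
print([str(p) for p in pi])          # ['12/37', '6/37', '4/37', '3/37', '12/37']
```

So the limiting probabilities are π_0 = 12/37, π_1 = 6/37, π_2 = 4/37, π_3 = 3/37, π_4 = 12/37. Note π_0 = π_4, as expected: every visit to state 0 is immediately followed by a visit to state 4.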