Question:
Consider a Markov chain with states 0, 1, 2, 3, 4.
Suppose P_{0,4} = 1, and suppose that when the chain is in state i, i > 0, the next state is equally likely to be any of the states 0, 1, ..., i - 1.
Find the limiting probabilities of this Markov chain.
Step by Step Answer:
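A sketch of the derivation (the original answer was not preserved on this page). Writing pi_j for the limiting probability of state j, the balance equations pi_j = sum_i pi_i P_{i,j} give: state 4 is entered only from state 0, so pi_4 = pi_0; state j < 4 is entered from each state i > j with probability 1/i, so pi_3 = pi_4/4, pi_2 = pi_4/4 + pi_3/3, pi_1 = pi_4/4 + pi_3/3 + pi_2/2, and pi_0 = pi_4/4 + pi_3/3 + pi_2/2 + pi_1. Solving in terms of pi_0 yields pi_4 = pi_0, pi_3 = pi_0/4, pi_2 = pi_0/3, pi_1 = pi_0/2. Normalizing, pi_0 (1 + 1/2 + 1/3 + 1/4 + 1) = (37/12) pi_0 = 1, so

pi_0 = 12/37, pi_1 = 6/37, pi_2 = 4/37, pi_3 = 3/37, pi_4 = 12/37.

The result can be checked numerically by solving pi P = pi with NumPy (a minimal sketch, not part of the original answer):

```python
import numpy as np

# Transition matrix for states 0..4:
# state 0 goes to state 4 with probability 1 (P_{0,4} = 1);
# state i > 0 goes to each of states 0, ..., i-1 with probability 1/i.
P = np.zeros((5, 5))
P[0, 4] = 1.0
for i in range(1, 5):
    P[i, :i] = 1.0 / i

# The stationary distribution pi satisfies pi P = pi, i.e. pi is the
# left eigenvector of P for eigenvalue 1. Take the eigenvector of P^T
# whose eigenvalue is closest to 1 and normalize it to sum to 1.
vals, vecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(vals - 1))
pi = np.real(vecs[:, idx])
pi /= pi.sum()

print(pi * 37)  # should be close to [12, 6, 4, 3, 12]
```

Multiplying by 37 makes the components easy to read off as the integer numerators 12, 6, 4, 3, 12 over the common denominator 37.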