Question:
7. A transition probability matrix $P$ is said to be doubly stochastic if
$$
\sum_{i=0}^{M} P_{ij} = 1
$$
for all states $j = 0, 1, \ldots, M$; that is, the column sums (as well as the row sums) all equal 1. If such a Markov chain is ergodic, show that its limiting probabilities are
$$
\pi_j = \frac{1}{M+1}, \qquad j = 0, 1, \ldots, M.
$$
Step by Step Answer:

Since the chain is ergodic, the limiting probabilities $\pi_0, \ldots, \pi_M$ are the unique solution of the stationarity equations
$$
\pi_j = \sum_{i=0}^{M} \pi_i P_{ij}, \qquad j = 0, 1, \ldots, M,
$$
subject to $\sum_{j=0}^{M} \pi_j = 1$. Try the candidate $\pi_j = 1/(M+1)$ for every $j$. Substituting it into the right-hand side gives
$$
\sum_{i=0}^{M} \frac{1}{M+1} P_{ij} = \frac{1}{M+1} \sum_{i=0}^{M} P_{ij} = \frac{1}{M+1},
$$
where the last step uses the doubly stochastic property $\sum_{i} P_{ij} = 1$. The candidate also satisfies the normalization condition, since $\sum_{j=0}^{M} 1/(M+1) = 1$. By uniqueness of the solution for an ergodic chain, $\pi_j = 1/(M+1)$ for all $j = 0, 1, \ldots, M$.
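As a quick numerical sanity check (a sketch, not part of the textbook solution), the matrix below is an assumed example of a doubly stochastic, ergodic chain on $M+1 = 4$ states, built as a convex combination of permutation matrices so that double stochasticity holds by construction; the mixing weights 0.5/0.3/0.2 are arbitrary:

```python
import numpy as np

# Doubly stochastic 4x4 transition matrix: a convex combination of
# permutation matrices (identity plus two cyclic shifts), so every row
# AND every column sums to 1. The weights are an arbitrary choice.
I = np.eye(4)
P = 0.5 * I + 0.3 * np.roll(I, 1, axis=1) + 0.2 * np.roll(I, 2, axis=1)

# Sanity-check double stochasticity.
assert np.allclose(P.sum(axis=0), 1.0)  # column sums
assert np.allclose(P.sum(axis=1), 1.0)  # row sums

# For an ergodic chain, every row of P^n converges to the limiting
# distribution (pi_0, ..., pi_M); here each entry should approach
# 1/(M+1) = 1/4.
Pn = np.linalg.matrix_power(P, 200)
print(Pn[0])
```

The positive diagonal weight makes the chain aperiodic and the cyclic shift makes it irreducible, so the chain is ergodic and the printed row is (numerically) the uniform distribution, matching the result proved above.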