Question:
Consider the Markov chain consisting of the three states 0, 1, 2 and having transition probability matrix
It is easy to verify that this Markov chain is irreducible. For example, it is possible to go from state 0 to state 2 since 0 → 1 → 2. That is, one way of getting from state 0 to state 2 is to go from state 0 to state 1 (with probability 1/2) and then go from state 1 to state 2 (with probability 1/4).
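The reachability argument above can be automated: a finite Markov chain is irreducible exactly when every state can be reached from every other state through transitions of positive probability. Below is a minimal Python sketch. The matrix `P` is an assumption: the excerpt does not show the full matrix, so this one is only hypothetical, chosen to be consistent with the two transition probabilities mentioned in the text (P(0→1) = 1/2 and P(1→2) = 1/4).

```python
# Hypothetical 3-state transition matrix; the actual matrix is not
# shown in the excerpt. Entries agree with the probabilities cited
# in the text: P[0][1] = 1/2 and P[1][2] = 1/4.
P = [
    [1/2, 1/2, 0.0],
    [1/2, 1/4, 1/4],
    [0.0, 1/3, 2/3],
]

def reachable(P, start):
    """Return the set of states reachable from `start` via
    transitions with strictly positive probability."""
    seen = {start}
    stack = [start]
    while stack:
        i = stack.pop()
        for j, p in enumerate(P[i]):
            if p > 0 and j not in seen:
                seen.add(j)
                stack.append(j)
    return seen

def is_irreducible(P):
    """A chain is irreducible iff every state reaches every state."""
    n = len(P)
    return all(reachable(P, i) == set(range(n)) for i in range(n))

print(is_irreducible(P))  # prints True for this example matrix
```

The check is just a depth-first search on the directed graph whose edges are the positive entries of `P`; irreducibility is equivalent to that graph being strongly connected.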