Question:

Consider a Markov chain with states 1, 2, 3 and transition probability matrix P. (The entries of P were given as an image in the original post and are not transcribed here.)

(a) Let m(j, 1) be the mean number of transitions, starting in state j, until the chain makes a transition into state 1. Find m(j, 1) for j = 1, 2, 3.
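
Part (a) is a standard first-step (conditioning) argument: m(j, 1) = 1 + Σ_{k≠1} P_{jk} m(k, 1) for each j, which is a small linear system. Below is a minimal sketch in Python/NumPy. Since the matrix in the original post was an image, the entries of P here are made-up placeholders (rows summing to 1) solely so the code runs; substitute the actual matrix.

```python
import numpy as np

# PLACEHOLDER entries -- the real 3x3 matrix was an image in the
# original post; replace these values before trusting the output.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.5, 0.3],
              [0.3, 0.3, 0.4]])

# First-step equations: m(j,1) = 1 + sum_{k != 1} P[j,k] * m(k,1).
# Solve for m(2,1) and m(3,1) first, then recover m(1,1).
Q = P[1:, 1:]                          # transitions among states 2 and 3
m23 = np.linalg.solve(np.eye(2) - Q, np.ones(2))
m11 = 1.0 + P[0, 1:] @ m23             # m(1,1) = mean return time to state 1
for j, mj in zip((1, 2, 3), np.concatenate(([m11], m23))):
    print(f"m({j}, 1) = {mj:.4f}")
```

Note that m(1, 1) is the mean recurrence time of state 1, which ties part (a) to part (b) below.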

(b) Find the stationary probabilities of this Markov chain.
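
Part (b) amounts to solving the balance equations π P = π together with the normalization π₁ + π₂ + π₃ = 1. A sketch reusing the placeholder P from part (a):

```python
import numpy as np

P = np.array([[0.5, 0.3, 0.2],         # same PLACEHOLDER matrix as in (a)
              [0.2, 0.5, 0.3],
              [0.3, 0.3, 0.4]])

# pi P = pi gives three balance equations, one of which is redundant;
# drop it and substitute the normalization constraint instead.
A = np.vstack([(P.T - np.eye(3))[:2], np.ones((1, 3))])
b = np.array([0.0, 0.0, 1.0])
pi = np.linalg.solve(A, b)
print("pi =", pi)
```

As a consistency check, m(1, 1) from part (a) should equal 1/π₁, since the mean recurrence time of a state is the reciprocal of its stationary probability.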

(c) Given that X0 = 3, find the expected number of transitions until the pattern 1, 2, 2, 1, 3, 1, 2, 2, 1 appears.
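
One way to handle part (c) numerically (a different route from the renewal-theory overlap argument often used for such pattern problems) is to enlarge the state to (chain state, pattern progress), where progress is the length of the longest prefix of the target pattern matched by the most recent states (a KMP-style automaton), and then solve the usual expected-hitting-time equations for absorption at full progress. A sketch, again with the placeholder P:

```python
import numpy as np

P = np.array([[0.5, 0.3, 0.2],         # same PLACEHOLDER matrix as above
              [0.2, 0.5, 0.3],
              [0.3, 0.3, 0.4]])

pattern = [1, 2, 2, 1, 3, 1, 2, 2, 1]
k = len(pattern)

def advance(progress, symbol):
    """New progress after seeing `symbol`: the longest prefix of the
    pattern that is a suffix of the matched prefix plus `symbol`."""
    s = pattern[:progress] + [symbol]
    for L in range(len(s), 0, -1):
        if s[-L:] == pattern[:L]:
            return L
    return 0

# Product chain (state, progress), absorbing once progress reaches k.
states = [(i, p) for i in (1, 2, 3) for p in range(k)]
idx = {s: n for n, s in enumerate(states)}
A = np.eye(len(states))
b = np.ones(len(states))
for n, (i, p) in enumerate(states):
    for j in (1, 2, 3):
        q = advance(p, j)
        if q < k:                       # not yet absorbed
            A[n, idx[(j, q)]] -= P[i - 1, j - 1]
E = np.linalg.solve(A, b)
print("E[transitions | X0 = 3] =", E[idx[(3, 0)]])
```

Starting from (3, 0) is correct here because the pattern begins with state 1, so X0 = 3 contributes no initial progress.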
