
Question:

The following is the transition probability matrix of a Markov chain with states 1, 2, 3, 4:

$$P = \begin{pmatrix} 0.4 & 0.3 & 0.2 & 0.1 \\ 0.2 & 0.2 & 0.2 & 0.4 \\ 0.25 & 0.25 & 0.5 & 0 \\ 0.2 & 0.1 & 0.4 & 0.3 \end{pmatrix}$$

If $X_0 = 1$:

(a) find the probability that state 3 is entered before state 4;

(b) find the mean number of transitions until either state 3 or state 4 is entered.


Step by Step Answer:
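(a) Make states 3 and 4 absorbing, and let $\alpha_i$ denote the probability that state 3 is entered before state 4 given $X_0 = i$, for $i = 1, 2$. Conditioning on the first transition gives

$$\alpha_1 = 0.2 + 0.4\,\alpha_1 + 0.3\,\alpha_2, \qquad \alpha_2 = 0.2 + 0.2\,\alpha_1 + 0.2\,\alpha_2.$$

The second equation yields $\alpha_2 = 0.25 + 0.25\,\alpha_1$; substituting into the first gives $\alpha_1 = 0.275 + 0.475\,\alpha_1$, so

$$\alpha_1 = \frac{0.275}{0.525} = \frac{11}{21} \approx 0.5238.$$

(b) Let $\mu_i$ denote the expected number of transitions until state 3 or 4 is entered, given $X_0 = i$. Conditioning on the first transition (one step is taken regardless of where it leads) gives

$$\mu_1 = 1 + 0.4\,\mu_1 + 0.3\,\mu_2, \qquad \mu_2 = 1 + 0.2\,\mu_1 + 0.2\,\mu_2.$$

The second equation yields $\mu_2 = 1.25 + 0.25\,\mu_1$; substituting into the first gives $\mu_1 = 1.375 + 0.475\,\mu_1$, so

$$\mu_1 = \frac{1.375}{0.525} = \frac{55}{21} \approx 2.619.$$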
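These values can be checked numerically with the standard absorbing-chain formulas $B = (I - Q)^{-1}R$ (absorption probabilities) and $t = (I - Q)^{-1}\mathbf{1}$ (expected steps to absorption). Below is a minimal sketch in Python using NumPy; the names `Q`, `R`, `B`, and `t` follow the usual fundamental-matrix notation and are not part of the question.

```python
import numpy as np

# Transition matrix from the question (states 1..4 -> indices 0..3).
P = np.array([
    [0.40, 0.30, 0.20, 0.10],
    [0.20, 0.20, 0.20, 0.40],
    [0.25, 0.25, 0.50, 0.00],
    [0.20, 0.10, 0.40, 0.30],
])

# Treat states 3 and 4 as absorbing. Q holds transitions among the
# transient states {1, 2}; R holds transitions from {1, 2} into {3, 4}.
Q = P[:2, :2]
R = P[:2, 2:]
I = np.eye(2)

# (a) Absorption probabilities: B[i, j] = P(absorbed at state j+3 | start i+1).
B = np.linalg.solve(I - Q, R)
print("P(enter 3 before 4 | X0 = 1) =", B[0, 0])        # 11/21 ~ 0.5238

# (b) Expected number of transitions until absorption.
t = np.linalg.solve(I - Q, np.ones(2))
print("E[steps until 3 or 4 | X0 = 1] =", t[0])          # 55/21 ~ 2.6190
```

Both printed values agree with the fractions obtained above, $11/21$ and $55/21$.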
