
Question:

Consider an irreducible finite Markov chain with states 0, 1, . . . , N.

(a) Starting in state i, what is the probability the process will ever visit state j? Explain!
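One way to justify the answer (a sketch using only the stated assumptions): an irreducible chain on the finite state space {0, 1, . . . , N} has all of its states recurrent, so starting from i the chain visits every state, and in particular j, with probability 1. Writing Tj = min{n ≥ 0 : Xn = j} for the time of the first visit to j,

    P{Tj < ∞ | X0 = i} = 1 for all i, j in {0, 1, . . . , N}.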

(b) Let xi = P{visit state N before state 0|start in i}. Compute a set of linear equations that the xi satisfy, i = 0, 1, . . . , N.
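One standard way to obtain such equations (a sketch via first-step analysis; the exact form asked for may differ): the two boundary values are fixed by the definition of xi, and for an interior state the event "reach N before 0" is decomposed according to the first transition. This gives

    x0 = 0, xN = 1,
    xi = ∑_j Pij xj for i = 1, . . . , N − 1,

where the sum runs over all states j = 0, 1, . . . , N.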

(c) If ∑_j j Pij = i for i = 1, . . . , N − 1, show that xi = i/N is a solution to the equations in part (b).
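A sketch of the check for part (c): substituting xj = j/N into the interior equations from part (b) and using the assumed condition,

    ∑_j Pij (j/N) = (1/N) ∑_j j Pij = i/N = xi for i = 1, . . . , N − 1,

while the boundary values also match, since 0/N = 0 = x0 and N/N = 1 = xN. Hence xi = i/N satisfies every equation in part (b).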


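As an illustrative numerical check, the minimal Python sketch below builds a hypothetical transition matrix (a simple symmetric walk on {0, . . . , N}, reflected at the two endpoints) that is irreducible and satisfies ∑_j j Pij = i at the interior states, solves the linear system from part (b), and compares the result with i/N. The matrix choice and variable names are illustrative assumptions, not part of the original problem.

import numpy as np

N = 6

# Hypothetical example chain: simple symmetric walk on {0, ..., N},
# reflected at 0 and N so the chain is irreducible.
# Interior rows satisfy sum_j j * P[i, j] = (i - 1)/2 + (i + 1)/2 = i.
P = np.zeros((N + 1, N + 1))
P[0, 1] = 1.0
P[N, N - 1] = 1.0
for i in range(1, N):
    P[i, i - 1] = 0.5
    P[i, i + 1] = 0.5

# Solve x_i = sum_j P[i, j] x_j for the interior states, with x_0 = 0, x_N = 1.
A = np.eye(N - 1) - P[1:N, 1:N]    # (I - P_interior) x_interior = b
b = P[1:N, N]                      # contribution of the boundary condition x_N = 1
x_interior = np.linalg.solve(A, b)

x = np.concatenate(([0.0], x_interior, [1.0]))
print(x)                                       # approximately [0, 1/N, 2/N, ..., 1]
print(np.allclose(x, np.arange(N + 1) / N))    # True: x_i = i/N

For any other irreducible transition matrix satisfying the same interior condition, the same computation should again return xi ≈ i/N, as part (c) predicts.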
