Question:

Consider a Markov chain with states 1, 2, 3 having transition probability matrix P = (Pij), i, j = 1, 2, 3. [The matrix entries are given in the original problem as an image and are not transcribed here.]

(a) If the chain is currently in state 1, find the probability that after two transitions it will be in state 2.
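For (a), the required quantity is the (1, 2) entry of the two-step transition matrix P², i.e. Σk P1k Pk2. Below is a minimal numerical sketch; since the problem's matrix was not transcribed, the matrix used here is only a placeholder and its entries are an assumption.

```python
import numpy as np

# Placeholder transition matrix: the real entries come from the problem's
# untranscribed image, so substitute them here.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.5, 0.3],
              [0.3, 0.3, 0.4]])

P2 = P @ P                                 # two-step transition matrix
print("P(X2 = 2 | X0 = 1) =", P2[0, 1])    # states 1, 2, 3 map to indices 0, 1, 2
```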

(b) Suppose you receive a reward r(i) = i² whenever the Markov chain is in state i, i = 1, 2, 3. Find your long-run average reward per unit time.
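For (b), the long-run average reward per unit time is Σi πi r(i), where π is the stationary distribution (π = πP, Σi πi = 1). A sketch under the same placeholder-matrix assumption as above:

```python
import numpy as np

# Same placeholder matrix caveat as above: substitute the problem's actual entries.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.5, 0.3],
              [0.3, 0.3, 0.4]])

# Stationary distribution: solve pi P = pi together with sum(pi) = 1.
A = np.vstack([P.T - np.eye(3), np.ones((1, 3))])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi = np.linalg.lstsq(A, b, rcond=None)[0]

r = np.array([1.0, 4.0, 9.0])        # r(i) = i**2 for i = 1, 2, 3
print("long-run average reward =", pi @ r)
```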
Let Ni denote the number of transitions, starting in state i, until the Markov chain enters state 3.

(c) Find E[N1].

(d) Find P(N1 ≤ 4).

(e) Find P(N1 = 4).
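For (c)–(e), one standard route is to make state 3 absorbing: E[N1] solves the linear system (I − Q)m = 1 over the transient states {1, 2}, where Q is the sub-matrix of transitions among {1, 2}; P(N1 ≤ n) is then the probability that the absorbing chain, started in state 1, sits in state 3 after n steps, and P(N1 = 4) = P(N1 ≤ 4) − P(N1 ≤ 3). A sketch, again with a placeholder matrix standing in for the untranscribed one:

```python
import numpy as np

# Placeholder matrix (substitute the actual entries from the problem).
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.5, 0.3],
              [0.3, 0.3, 0.4]])

# (c) Expected number of transitions to reach state 3:
# E[N_i] = 1 + sum over j != 3 of P_ij * E[N_j], for i = 1, 2.
Q = P[:2, :2]                          # transitions among the transient states {1, 2}
EN = np.linalg.solve(np.eye(2) - Q, np.ones(2))
print("E[N1] =", EN[0])

# (d), (e) Distribution of N1: make state 3 absorbing and track the
# probability of having been absorbed after n steps.
P_abs = P.copy()
P_abs[2] = [0.0, 0.0, 1.0]             # state 3 absorbs the chain
start = np.array([1.0, 0.0, 0.0])      # chain starts in state 1
e3 = np.array([0.0, 0.0, 1.0])

absorbed = [start @ np.linalg.matrix_power(P_abs, n) @ e3 for n in range(5)]
print("P(N1 <= 4) =", absorbed[4])
print("P(N1 = 4)  =", absorbed[4] - absorbed[3])
```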
