Question:
A Markov chain $\{X_n,\ n \ge 0\}$ with states 0, 1, 2 has the transition probability matrix

$$P = \begin{bmatrix} \tfrac{1}{2} & \tfrac{1}{3} & \tfrac{1}{6} \\[2pt] 0 & \tfrac{1}{3} & \tfrac{2}{3} \\[2pt] \tfrac{1}{2} & 0 & \tfrac{1}{2} \end{bmatrix}$$

If $P\{X_0 = 0\} = P\{X_0 = 1\} = \tfrac{1}{4}$, find $E[X_3]$.
Step by Step Answer:
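One way to obtain E[X_3]: since the initial probabilities must sum to 1, P{X_0 = 2} = 1/2. Pushing this initial distribution through the transition matrix three times gives the distribution of X_3, and E[X_3] is then the sum over states j of j · P{X_3 = j}. Below is a minimal Python sketch of that calculation using exact fractions (the variable and function names are illustrative, not from the source):

```python
from fractions import Fraction as F

# Transition probability matrix P over states 0, 1, 2 (each row sums to 1).
P = [
    [F(1, 2), F(1, 3), F(1, 6)],
    [F(0, 1), F(1, 3), F(2, 3)],
    [F(1, 2), F(0, 1), F(1, 2)],
]

# Initial distribution: P{X0 = 0} = P{X0 = 1} = 1/4, hence P{X0 = 2} = 1/2.
dist = [F(1, 4), F(1, 4), F(1, 2)]

def step(dist, P):
    """Advance the distribution one step: new[j] = sum_i dist[i] * P[i][j]."""
    n = len(dist)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# Distribution of X3: push the initial distribution through P three times.
for _ in range(3):
    dist = step(dist, P)

# E[X3] = sum over states j of j * P{X3 = j}.
expected_value = sum(j * p for j, p in enumerate(dist))
print(dist)            # distribution of X3
print(expected_value)  # E[X3]
```

With the matrix above, this gives the distribution of X_3 as (59/144, 43/216, 169/432), so E[X_3] = 53/54 ≈ 0.98.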