Question:

2.1. Consider the Markov chain whose transition probability matrix is given by

[Transition probability matrix P (4 × 4, on states 0, 1, 2, 3) not transcribed in the source.]

Suppose that the initial distribution is $p_i = \tfrac{1}{4}$ for $i = 0, 1, 2, 3$. Show that $\Pr\{X_n = k\} = \tfrac{1}{4}$, $k = 0, 1, 2, 3$, for all $n$. Can you deduce a general result from this example?


Step by Step Answer:
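A sketch of the standard argument. Note that the claim itself forces the transition matrix $P$ to be doubly stochastic: taking $n = 1$ in the statement gives $\tfrac{1}{4} = \Pr\{X_1 = k\} = \tfrac{1}{4}\sum_{i=0}^{3} P_{ik}$, so every column of $P$, like every row, must sum to one.

Proceed by induction on $n$. The base case is the given initial distribution, $\Pr\{X_0 = k\} = p_k = \tfrac{1}{4}$. For the inductive step, suppose $\Pr\{X_n = i\} = \tfrac{1}{4}$ for all $i$. Then

$$
\Pr\{X_{n+1} = k\} \;=\; \sum_{i=0}^{3} \Pr\{X_n = i\}\, P_{ik}
\;=\; \frac{1}{4} \sum_{i=0}^{3} P_{ik} \;=\; \frac{1}{4},
$$

since each column of $P$ sums to one. The general result one can deduce: if $P$ is a doubly stochastic transition matrix on $N$ states and $X_0$ is uniformly distributed, then $X_n$ is uniformly distributed for every $n$; equivalently, the uniform distribution is stationary for any doubly stochastic Markov chain.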

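As a quick numerical sanity check, here is a minimal sketch in Python. The matrix below is a hypothetical doubly stochastic stand-in, since the book's actual matrix was not transcribed:

```python
import numpy as np

# Hypothetical 4x4 doubly stochastic matrix (every row AND every
# column sums to 1); the textbook's matrix was not transcribed.
P = np.array([
    [0.1, 0.2, 0.3, 0.4],
    [0.4, 0.3, 0.2, 0.1],
    [0.2, 0.1, 0.4, 0.3],
    [0.3, 0.4, 0.1, 0.2],
])
assert np.allclose(P.sum(axis=0), 1.0)  # columns sum to 1
assert np.allclose(P.sum(axis=1), 1.0)  # rows sum to 1

p = np.full(4, 0.25)  # uniform initial distribution, p_i = 1/4
for n in range(1, 11):
    p = p @ P  # one step: p_{n+1}[k] = sum_i p_n[i] * P[i, k]
    assert np.allclose(p, 0.25), f"not uniform at step {n}"
print("Pr{X_n = k} stays 1/4 for every n checked.")
```

Swapping in any other doubly stochastic matrix leaves the assertions passing, which is exactly the general result above.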
Related Book:

An Introduction to Stochastic Modeling, 3rd Edition, by Samuel Karlin and Howard M. Taylor. ISBN: 9780126848878.
