Question:

4.4. Let {a_i : i = 1, 2, ...} be a probability distribution, and consider the Markov chain whose transition probability matrix is

[transition probability matrix shown as an image; not transcribed]

What condition on the probability distribution {a_i : i = 1, 2, ...} is necessary and sufficient in order that a limiting distribution exist, and what is this limiting distribution? Assume a_1 > 0 and a_2 > 0, so that the chain is aperiodic.
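The matrix itself was not transcribed, but the aperiodicity remark suggests the standard renewal form: from state 1 the chain jumps to state i with probability a_i, and from state i > 1 it steps down deterministically to i - 1. Under that assumption (not confirmed by the source), the following sketch simulates the chain for a geometric choice of {a_i}, whose mean sum of i*a_i is finite, and reports the empirical state frequencies. The function name and the geometric choice are illustrative, not from the original problem.

```python
import random

def simulate_renewal_chain(p=0.5, steps=200_000, seed=42):
    """Simulate the (assumed) renewal chain and return empirical state
    frequencies. From state 1 the chain jumps to state i with probability
    a_i; from state i > 1 it steps down to i - 1."""
    rng = random.Random(seed)
    state = 1
    visits = {}
    for _ in range(steps):
        if state == 1:
            # Draw the jump destination from the geometric choice
            # a_i = (1 - p) * p**(i - 1), which has finite mean 1/(1 - p).
            state = 1
            while rng.random() < p:
                state += 1
        else:
            state -= 1
        visits[state] = visits.get(state, 0) + 1
    return {s: c / steps for s, c in visits.items()}

freqs = simulate_renewal_chain(p=0.5)
# For this choice of {a_i} the predicted limiting distribution is
# pi_j = (1 - p) * p**(j - 1), i.e. pi_1 = 0.5 and pi_2 = 0.25 here.
print(freqs[1], freqs[2])
```

With 200,000 steps the empirical frequencies of states 1 and 2 land close to the predicted 0.5 and 0.25, which is a quick sanity check on whatever formula one derives for the limiting distribution.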


Step by Step Answer:
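The answer body was not captured in this extraction. Assuming the renewal form of the matrix described above (P(1, j) = a_j and P(i, i - 1) = 1 for i > 1, consistent with the hint that a_1 > 0 and a_2 > 0 give aperiodicity), a sketch of the argument: the return time T to state 1 satisfies P(T = j) = a_j, so the chain is positive recurrent, and a limiting distribution exists, exactly when the mean return time is finite.

```latex
% Sketch under the assumed renewal form of P (not confirmed by the source).
% Necessary and sufficient condition: finite mean return time to state 1,
\[
  \mu \;=\; \sum_{j \ge 1} j\, a_j \;<\; \infty .
\]
% By the cycle argument, state j is visited once per cycle iff T >= j,
% so the limiting distribution is
\[
  \pi_j \;=\; \frac{\Pr\{T \ge j\}}{\mu}
        \;=\; \frac{\sum_{k \ge j} a_k}{\sum_{k \ge 1} k\, a_k},
  \qquad j = 1, 2, \ldots
\]
```

For the geometric example a_j = (1 - p) p^{j-1}, this gives pi_j = (1 - p) p^{j-1}, matching the frequencies a simulation of the chain produces.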

Related Book:

An Introduction To Stochastic Modeling, 3rd Edition

Authors: Samuel Karlin, Howard M. Taylor

ISBN: 9780126848878
