
Question:

Consider a Markov chain with transition probability matrix $P_{ij}$, and suppose that $\sum_{j \ge k} P_{ij}$ increases in $i$ for all $k$.

(a) Show that, for all increasing functions $f$, $\sum_j P_{ij} f(j)$ increases in $i$.

(b) Show that $\sum_{j \ge k} P^n_{ij}$ increases in $i$ for all $k$, where $P^n_{ij}$ are the $n$-step transition probabilities, $n \ge 2$.
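Before proving the claims, it can help to check them numerically. The sketch below uses NumPy and a hypothetical 3-state stochastically monotone matrix (chosen for illustration, not taken from the book) to verify both properties on a concrete example:

```python
import numpy as np

# A stochastically monotone 3-state transition matrix: each tail sum
# sum_{j>=k} P[i, j] is nondecreasing in the starting state i.
P = np.array([[0.5, 0.3, 0.2],
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])

def tail_sums(M, k):
    """Return sum_{j>=k} M[i, j] for every row i."""
    return M[:, k:].sum(axis=1)

def is_monotone(M):
    """Check that every tail sum is nondecreasing in i."""
    n = M.shape[0]
    return all(np.all(np.diff(tail_sums(M, k)) >= -1e-12) for k in range(n))

# Property (a): for an increasing f, (Pf)(i) = sum_j P[i, j] f(j)
# should be nondecreasing in i.
f = np.array([0.0, 1.0, 3.0])          # an increasing function on the states
Pf = P @ f
assert is_monotone(P)
assert np.all(np.diff(Pf) >= 0)

# Property (b): the n-step matrix P^n inherits the monotonicity (n = 2 here).
P2 = P @ P
assert is_monotone(P2)
print("monotonicity verified for P, P @ f, and P^2")
```

A single counterexample search over random matrices would not prove anything, but checks like this catch sign or index errors before the formal argument is attempted.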


Step by Step Answer:
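The answer body is not included on this page. A standard proof sketch (assuming the state space is $\{0, 1, 2, \dots\}$ and $f$ is nonnegative or bounded so the sums converge) runs as follows:

```latex
\textbf{(a)} For an increasing $f$ on $\{0,1,2,\dots\}$, write the telescoping sum
$f(j) = f(0) + \sum_{k=1}^{j} \bigl(f(k) - f(k-1)\bigr)$. Then
\[
\sum_j P_{ij} f(j)
  = f(0) + \sum_j P_{ij} \sum_{k=1}^{j} \bigl(f(k)-f(k-1)\bigr)
  = f(0) + \sum_{k \ge 1} \bigl(f(k)-f(k-1)\bigr) \sum_{j \ge k} P_{ij},
\]
where interchanging the sums is justified because every term is nonnegative.
Each increment $f(k)-f(k-1) \ge 0$, and each tail sum $\sum_{j \ge k} P_{ij}$
increases in $i$ by hypothesis, so the whole expression increases in $i$.

\textbf{(b)} Induct on $n$. The case $n = 1$ is the hypothesis. Suppose
$\sum_{j \ge k} P^{\,n-1}_{ij}$ increases in $i$ for all $k$. Fix $k$ and set
$g(l) = \sum_{j \ge k} P^{\,n-1}_{lj}$, an increasing function of $l$ by the
induction hypothesis. By the Chapman--Kolmogorov equations,
\[
\sum_{j \ge k} P^{\,n}_{ij} = \sum_l P_{il}\, g(l),
\]
which increases in $i$ by part (a) applied to the increasing function $g$.
\]
```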

Related Book:

Stochastic Processes

ISBN: 9780471120629

2nd Edition

Authors: Sheldon M. Ross
