
Question:

Consider a continuous-time Markov chain with infinitesimal transition matrix $\Lambda = (\Lambda_{ij})$ and equilibrium distribution $\pi$. If the chain is at equilibrium at time 0, then show that it experiences $t \sum_i \pi_i \lambda_i$ transitions on average during the time interval $[0, t]$, where $\lambda_i = \sum_{j \neq i} \Lambda_{ij}$.

Step by Step Answer:
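One way to sketch the argument, writing $N_t$ for the number of transitions in $[0, t]$ (a symbol introduced here for the sketch; it is not in the original statement): while the chain sits in state $i$ it leaves at total rate $\lambda_i$, so the transition-counting process has conditional intensity $\lambda_{X_s}$, and

$$
\mathbb{E}[N_t]
= \mathbb{E}\!\left[\int_0^t \lambda_{X_s}\,ds\right]
= \int_0^t \mathbb{E}[\lambda_{X_s}]\,ds
= \int_0^t \sum_i \pi_i \lambda_i \,ds
= t \sum_i \pi_i \lambda_i ,
$$

since starting at equilibrium means $X_s \sim \pi$ for every $s \in [0, t]$, and Tonelli's theorem (the integrand is nonnegative) justifies exchanging expectation and integral.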

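As an optional numerical sanity check (not part of the posted question or its answer), a short Gillespie-style simulation can be compared against the closed form $t \sum_i \pi_i \lambda_i$; the 3-state rate matrix, horizon `t_end`, and run count `n_runs` below are illustrative choices only.

```python
# Monte Carlo check of E[number of transitions in [0, t]] = t * sum_i pi_i * lambda_i.
# The rate matrix below is a made-up 3-state example; rows sum to zero.
import numpy as np

rng = np.random.default_rng(0)

Lam = np.array([[-3.0,  1.0,  2.0],
                [ 0.5, -1.5,  1.0],
                [ 1.0,  2.0, -3.0]])
lam = -np.diag(Lam)                          # lambda_i = sum_{j != i} Lambda_ij

# Equilibrium distribution: pi Lambda = 0 with entries summing to 1.
A = np.vstack([Lam.T, np.ones(len(Lam))])
b = np.append(np.zeros(len(Lam)), 1.0)
pi = np.linalg.lstsq(A, b, rcond=None)[0]

t_end, n_runs = 5.0, 20_000
counts = np.zeros(n_runs)
for r in range(n_runs):
    state = rng.choice(len(pi), p=pi)        # start at equilibrium
    t = 0.0
    while True:
        t += rng.exponential(1.0 / lam[state])   # Exp(lambda_i) holding time
        if t > t_end:
            break
        jump_probs = Lam[state].clip(min=0.0) / lam[state]
        state = rng.choice(len(pi), p=jump_probs)
        counts[r] += 1

print("simulated mean transitions:", counts.mean())
print("t * sum_i pi_i * lambda_i: ", t_end * float(pi @ lam))
```

The two printed numbers should agree up to Monte Carlo error.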