Question: Definition 12.82 Let X be a Markov chain with transition matrix P. The vector π = (πi : i ∈ S) is called an invariant distribution of the chain if:
(a) πi ≥ 0 for all i ∈ S, and Σi∈S πi = 1,
(b) π = πP, that is, πj = Σi∈S πi pij for all j ∈ S.
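The definition can be checked numerically. The sketch below uses a small, made-up 3×3 transition matrix (an assumption for illustration, not from the question) and finds π as a left eigenvector of P with eigenvalue 1, normalised so its entries sum to 1, then verifies conditions (a) and (b).

```python
import numpy as np

# Hypothetical 3-state transition matrix; rows sum to 1.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.1, 0.3, 0.6],
])

# Condition (b) says pi = pi P, so pi is a left eigenvector of P
# (equivalently, a right eigenvector of P transposed) with eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
k = np.argmin(np.abs(eigvals - 1.0))   # pick the eigenvalue closest to 1
pi = np.real(eigvecs[:, k])
pi = pi / pi.sum()                     # normalise so entries sum to 1

print(pi)
print(np.all(pi >= 0), np.isclose(pi.sum(), 1.0))  # condition (a)
print(np.allclose(pi @ P, pi))                     # condition (b)
```

For an irreducible finite chain this eigenvector is unique up to scaling, so the normalisation yields the unique invariant distribution.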
