Markov chain
Consider a Markov chain {Xn, n = 0, 1, ...} on the state space S = {0, 1, 2}. Suppose that the Markov chain has the transition matrix

    2 10 10 10 2 P = 3 10 2 4 10 10

1. Show that the Markov chain has a unique stationary mass.
2. Let h denote the stationary mass of the Markov chain. Find h(x) for all x ∈ S.
3. Show that the Markov chain has a steady-state mass.
4. Let h* denote the steady-state mass of the Markov chain. Find h*(x) for all x ∈ S.

For each of the following transition matrices, determine whether the Markov chain with that transition matrix is regular:

(1) Is the Markov chain whose transition matrix is 0 0.5 0.5 0.5 0 0.5 0 0 regular? (Yes or No)
(2) Is the Markov chain whose transition matrix is 0 1 0 0.3 0 0.7 0 0 regular? (Yes or No)
(3) Is the Markov chain whose transition matrix is

        0    1    0
        0.6  0    0.4
        1    0    0

    regular? (Yes or No)
(4) Is the Markov chain whose transition matrix is 0 1 0 0 0.6 0 0.4 regular? (Yes or No)
(5) Is the Markov chain whose transition matrix is

        0    1    0
        0.3  0.2  0.5
        0    1    0

    regular? (Yes or No)

1. (a) Explain what is meant by the transition probability matrix of a homogeneous Markov chain. [5 marks]
   (b) Explain what is meant by the stationary distribution of a Markov chain. [5 marks]
   (c) A Markov chain has transition probability matrix A, with entries A_ij, and stationary distribution π. Write down an expression for the entries of the transition matrix of the reverse Markov chain. [5 marks]
   (d) Consider the following transition probability matrix of a homogeneous Markov chain, with three states i, j and k (the TPM is in that order). If the stationary vector of the chain is (1/9, 2/9, 2/3), determine whether the Markov chain is reversible.

        0.2  0.2  0.6
        0.1  0.6  0.3
        0.1  0.1  0.8

   [5 marks]
   (e) Let X1, X2, X3 be a sequence of random variables resulting from the above Markov chain. If X1 = i and X3 = j, what is the probability that X2 = k? [5 marks]
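A minimal numerical sketch for the first problem, using a placeholder 3 x 3 stochastic matrix (assumed values, since the matrix entries above are not fully legible): the stationary mass h solves h P = h with its entries summing to 1, and when the steady-state mass h* exists every row of P^n converges to it.

    import numpy as np

    # Placeholder transition matrix on S = {0, 1, 2}; assumed values chosen only
    # to illustrate the computation, not the matrix from the problem statement.
    P = np.array([[0.2, 0.5, 0.3],
                  [0.3, 0.5, 0.2],
                  [0.4, 0.2, 0.4]])
    n = P.shape[0]

    # Stationary mass h: solve h P = h together with sum(h) = 1.
    A = np.vstack([P.T - np.eye(n), np.ones(n)])
    b = np.append(np.zeros(n), 1.0)
    h, *_ = np.linalg.lstsq(A, b, rcond=None)
    print("stationary mass h:", h)

    # Steady-state mass h*: if it exists, every row of P^n converges to it.
    print("rows of P^50:")
    print(np.linalg.matrix_power(P, 50))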
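For the regularity questions, a standard numerical check is whether some power of the transition matrix has every entry strictly positive (for a 3-state chain, powers up to 5 already suffice by Wielandt's bound, so the cutoff below is generous). The sketch applies this to items (3) and (5), the two matrices whose entries are listed in full above; the function name is_regular and the cutoff max_power are illustrative choices.

    import numpy as np

    def is_regular(P, max_power=20):
        # A finite chain is regular if some power of P has all entries > 0.
        Q = np.eye(P.shape[0])
        for _ in range(max_power):
            Q = Q @ P
            if np.all(Q > 0):
                return True
        return False

    # Matrices from items (3) and (5) above.
    P3 = np.array([[0.0, 1.0, 0.0],
                   [0.6, 0.0, 0.4],
                   [1.0, 0.0, 0.0]])
    P5 = np.array([[0.0, 1.0, 0.0],
                   [0.3, 0.2, 0.5],
                   [0.0, 1.0, 0.0]])

    print("(3) regular?", is_regular(P3))
    print("(5) regular?", is_regular(P5))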
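For part (d), reversibility is equivalent to detailed balance: the chain is reversible exactly when pi_r * A[r, s] = pi_s * A[s, r] for every pair of states r, s. A short sketch checking this numerically with the matrix and stationary vector from (d); it also prints the reverse-chain entries pi_s * A[s, r] / pi_r asked about in part (c).

    import numpy as np

    # Transition matrix from part (d), states in the order (i, j, k), and the
    # stationary vector given in the question.
    A = np.array([[0.2, 0.2, 0.6],
                  [0.1, 0.6, 0.3],
                  [0.1, 0.1, 0.8]])
    pi = np.array([1/9, 2/9, 2/3])

    # Detailed balance: reversible iff pi[r] * A[r, s] == pi[s] * A[s, r] for all r, s.
    flows = pi[:, None] * A          # flows[r, s] = pi[r] * A[r, s]
    print("reversible (detailed balance holds):", np.allclose(flows, flows.T))

    # Part (c): the reverse chain has entries pi[s] * A[s, r] / pi[r].
    reverse = (A.T * pi) / pi[:, None]
    print("reverse-chain transition matrix:")
    print(reverse)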
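For part (e), the Markov property gives P(X2 = k | X1 = i, X3 = j) = A[i, k] * A[k, j] / (A^2)[i, j], where the denominator is the two-step transition probability from i to j. A short numerical sketch using the matrix from (d):

    import numpy as np

    # Transition matrix from part (d), states in the order (i, j, k).
    A = np.array([[0.2, 0.2, 0.6],
                  [0.1, 0.6, 0.3],
                  [0.1, 0.1, 0.8]])
    i, j, k = 0, 1, 2

    # P(X2 = k | X1 = i, X3 = j) = A[i, k] * A[k, j] / (A^2)[i, j]
    two_step = (A @ A)[i, j]
    prob = A[i, k] * A[k, j] / two_step
    print("P(X2 = k | X1 = i, X3 = j) =", prob)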