Markov chain
A Markov chain with state space {1, 2, 3} has transition probability matrix

        0.6  0.3  0.1
  P =   0.3  0.3  0.4
        0.4  0.1  0.5

(a) Is this Markov chain irreducible? Is the Markov chain recurrent or transient? Explain your answers.

(b) What is the period of state 1? Hence deduce the period of the remaining states. Does this Markov chain have a limiting distribution?

(c) Consider a general three-state Markov chain with transition matrix

        p11  p12  p13
  P =   p21  p22  p23
        p31  p32  p33

Give an example of a specific set of probabilities p_ij for which the Markov chain is not irreducible (there is no single right answer to this, of course!).

For each of the following transition matrices, determine whether the Markov chain with that transition matrix is regular:

(1) Is the Markov chain whose transition matrix is

        0    0.5  0.5
        0.5  0    0.5
        1    0    0

regular? (Yes or No)

(2) Is the Markov chain whose transition matrix is

        0    1    0
        0.3  0    0.7
        1    0    0

regular? (Yes or No)

(3) Is the Markov chain whose transition matrix is

        0    1    0
        0.6  0    0.4
        1    0    0

regular? (Yes or No)

(4) Is the Markov chain whose transition matrix is

        0    1    0
        0    0    1
        0.6  0    0.4

regular? (Yes or No)

(5) Is the Markov chain whose transition matrix is

        0    1    0
        0.3  0.2  0.5
        0    1    0

regular? (Yes or No)

1. (a) Explain what is meant by the transition probability matrix of a homogeneous Markov chain. [5 marks]

(b) Explain what is meant by the stationary distribution of a Markov chain. [5 marks]

(c) A Markov chain has transition probability matrix A, with entries a_ij, and stationary distribution π. Write down an expression for the entries of the transition matrix of the reverse Markov chain. [5 marks]

(d) Consider the following transition probability matrix of a homogeneous Markov chain, with three states i, j and k (the TPM is in that order). If the stationary vector of the chain is (1/9, 2/9, 2/3), determine whether the Markov chain is reversible.
        0.2  0.2  0.6
        0.1  0.6  0.3
        0.1  0.1  0.8

[5 marks]

(e) Let X1, X2, X3 be a sequence of random variables resulting from the above Markov chain. If X1 = i and X3 = j, what is the probability that X2 = k? [5 marks]

1. Consider the Markov chain with the following transition matrix.

        0    0.5  0.5
        0.5  0    0.5
        0.5  0.5  0

(a) Draw the transition diagram of the Markov chain.

(b) Is the Markov chain ergodic? Give a reason for your answer.

(c) Compute the two-step transition matrix of the Markov chain.

(d) What is the state distribution π2 for t = 2 if the initial state distribution for t = 0 is π0 = (0.1, 0.5, 0.4)?
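The structural properties asked about above (irreducibility, period, regularity) can also be verified numerically. The sketch below, using NumPy on the first question's matrix, tests regularity by checking whether some power of P is strictly positive, and computes the period of a state as the gcd of its return times; the function names `is_regular` and `period` are my own, and this is a numerical check rather than the pen-and-paper argument the questions ask for.

```python
import numpy as np
from math import gcd
from functools import reduce

# Transition matrix from part (a) of the first question
P = np.array([[0.6, 0.3, 0.1],
              [0.3, 0.3, 0.4],
              [0.4, 0.1, 0.5]])

def is_regular(P, max_power=20):
    """A chain is regular if some power of P has all entries > 0."""
    Q = P.copy()
    for _ in range(max_power):
        if (Q > 0).all():
            return True
        Q = Q @ P
    return False

def period(P, state, max_n=60):
    """Period of a state: gcd of all n with P^n[state, state] > 0."""
    return_times = []
    Q = np.eye(len(P))
    for n in range(1, max_n + 1):
        Q = Q @ P
        if Q[state, state] > 1e-12:
            return_times.append(n)
    return reduce(gcd, return_times)

print(is_regular(P))   # True: every entry of P is already positive
print(period(P, 0))    # 1: P[0, 0] > 0, so state 1 returns in one step
```

The same two helpers can be applied to each of the five numbered matrices in the regularity question: build the matrix as a NumPy array and call `is_regular` on it.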
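The reversibility and two-step-distribution computations above can likewise be checked numerically. The sketch below (NumPy; the helper name `is_reversible` is my own) verifies the detailed-balance condition π_i p_ij = π_j p_ji for the exam question's TPM, and propagates the initial distribution of the final exercise via π_t = π_0 P^t.

```python
import numpy as np

# TPM and stationary vector from part (d) of the exam question
Q = np.array([[0.2, 0.2, 0.6],
              [0.1, 0.6, 0.3],
              [0.1, 0.1, 0.8]])
pi = np.array([1/9, 2/9, 2/3])

# Sanity check: pi is stationary, i.e. pi Q = pi
assert np.allclose(pi @ Q, pi)

def is_reversible(P, pi):
    """Detailed balance: pi_i * p_ij == pi_j * p_ji for all i, j."""
    F = pi[:, None] * P   # probability-flow matrix, F_ij = pi_i * p_ij
    return np.allclose(F, F.T)

print(is_reversible(Q, pi))

# Final exercise: two-step transition matrix and distribution at t = 2
P = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])
P2 = np.linalg.matrix_power(P, 2)   # two-step transition matrix
pi0 = np.array([0.1, 0.5, 0.4])
pi2 = pi0 @ P2                      # state distribution at t = 2
print(P2)
print(pi2)
```

Note that row vectors multiply the transition matrix from the left here, matching the row-stochastic convention used throughout these questions.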