Markov chain irreducibility
2. A Markov chain with state space {1, 2, 3} has transition probability matrix

   P =
   [ 0.6  0.3  0.1 ]
   [ 0.3  0.3  0.4 ]
   [ 0.4  0.1  0.5 ]

(a) Is this Markov chain irreducible? Is the Markov chain recurrent or transient? Explain your answers.
(b) What is the period of state 1? Hence deduce the period of the remaining states. Does this Markov chain have a limiting distribution?
(c) Consider a general three-state Markov chain with transition matrix

   P =
   [ p11  p12  p13 ]
   [ p21  p22  p23 ]
   [ p31  p32  p33 ]

Give an example of a specific set of probabilities p_ij for which the Markov chain is not irreducible (there is no single right answer to this, of course!).

For each of the following transition matrices, determine whether the Markov chain with that transition matrix is regular (a brute-force check is sketched after the problems):

(1) Is the Markov chain whose transition matrix is

   [ 0    1   ]
   [ 0.2  0.8 ]

regular? (Yes or No)

(2) Is the Markov chain whose transition matrix is

   [ 0    1    0   ]
   [ 0.1  0.4  0.5 ]
   [ 0    0    1   ]

regular? (Yes or No)

(3) Is the Markov chain whose transition matrix is

   [ 0    0    1   ]
   [ 0.8  0    0.2 ]
   [ 0.8  0.2  0   ]

regular? (Yes or No)

(4) Is the Markov chain whose transition matrix is

   [ 0    1    0   ]
   [ 0.8  0    0.2 ]
   [ 0    1    0   ]

regular? (Yes or No)

Consider a standard chessboard with an 8 x 8 grid of possible locations. We define a Markov chain by randomly moving a single chess piece on this board. The initial location X0 is sampled uniformly among the 8^2 = 64 squares. At time t, the piece then chooses Xt+1 by sampling uniformly from the set of legal moves given its current location Xt. For a description of legal chess moves, see http://en.wikipedia.org/wiki/Rules_of_chess#Basic_moves. A graph-based check for these questions is sketched after the problems.

a) Suppose the chess piece is a king, which can move to any of the 8 adjacent squares. Is the Markov chain irreducible? Is the Markov chain aperiodic?
b) Suppose the chess piece is a bishop. Is the Markov chain irreducible? Is the Markov chain aperiodic?
c) Suppose the chess piece is a knight. Is the Markov chain irreducible? Is the Markov chain aperiodic?

1. Consider the Markov chain with state space {0, 1, 2} and transition matrix P [matrix entries garbled in the source].

(a) Suppose X0 = 0. Find the probability that X2 = 2.
(b) Find the stationary distribution of the Markov chain.
(c) What proportion of time does the Markov chain spend in state 2, in the long run?
(d) Suppose X5 = 1. What is the expected additional number of steps (after time 5) until the first time the Markov chain returns to state 1?
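A quick way to sanity-check the four regularity questions above: a finite chain is regular exactly when some power of its transition matrix has all strictly positive entries. The sketch below (Python with numpy, assumed available) applies that test; the matrix entries are the reconstructions shown above, so treat them as assumptions rather than definitive quiz values.

```python
# Sketch: brute-force regularity check -- a chain is regular iff P^k > 0
# entrywise for some k. The four matrices are best-effort reconstructions
# of the problem statements above (assumed entries, rows summing to 1).
import numpy as np

def is_regular(P, max_power=100):
    """True if some power P^k (k <= max_power) has all strictly positive entries."""
    P = np.asarray(P, dtype=float)
    Q = P.copy()
    for _ in range(max_power):
        if (Q > 0).all():
            return True
        Q = Q @ P
    return False

matrices = {
    "(1)": [[0, 1], [0.2, 0.8]],
    "(2)": [[0, 1, 0], [0.1, 0.4, 0.5], [0, 0, 1]],
    "(3)": [[0, 0, 1], [0.8, 0, 0.2], [0.8, 0.2, 0]],
    "(4)": [[0, 1, 0], [0.8, 0, 0.2], [0, 1, 0]],
}
for name, P in matrices.items():
    print(name, "regular?", is_regular(P))
```

For a 3-state chain, a power well below 100 already settles the question, so the cap is just a safe upper bound: an absorbing state (as in matrix (2)) or a period-2 structure (as in matrix (4)) keeps some entries at zero in every power.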
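For the chessboard problems, irreducibility and periodicity can be checked mechanically from the move graph: the chain is irreducible iff every square is reachable from every other (the move relations are symmetric, so a single breadth-first search suffices), and the period of a communicating class is the gcd of level[u] + 1 - level[v] over all edges (u, v) within the class, using BFS levels. A minimal sketch, assuming standard open-board moves for a lone piece:

```python
# Sketch: build each piece's move graph on the 8x8 board, then use BFS to
# test reachability (irreducibility) and compute the period of the starting
# square's communicating class.
from collections import deque
from math import gcd

def moves(piece, r, c):
    """Destination squares for a lone piece at (r, c); no other pieces on the board."""
    on = lambda rr, cc: 0 <= rr < 8 and 0 <= cc < 8
    if piece == "king":
        deltas = [(dr, dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1) if (dr, dc) != (0, 0)]
        return [(r + dr, c + dc) for dr, dc in deltas if on(r + dr, c + dc)]
    if piece == "knight":
        deltas = [(1, 2), (2, 1), (-1, 2), (-2, 1), (1, -2), (2, -1), (-1, -2), (-2, -1)]
        return [(r + dr, c + dc) for dr, dc in deltas if on(r + dr, c + dc)]
    if piece == "bishop":
        out = []
        for dr, dc in ((1, 1), (1, -1), (-1, 1), (-1, -1)):
            rr, cc = r + dr, c + dc
            while on(rr, cc):          # slide along the diagonal until the edge
                out.append((rr, cc))
                rr, cc = rr + dr, cc + dc
        return out
    raise ValueError(piece)

def analyze(piece, start=(0, 0)):
    level, q, g = {start: 0}, deque([start]), 0
    while q:
        u = q.popleft()
        for v in moves(piece, *u):
            if v not in level:         # tree edge: contributes gcd term 0, skip
                level[v] = level[u] + 1
                q.append(v)
            else:                      # non-tree edge: contributes to the period gcd
                g = gcd(g, abs(level[u] + 1 - level[v]))
    irreducible = len(level) == 64
    return irreducible, g

for piece in ("king", "bishop", "knight"):
    irr, per = analyze(piece)
    print(f"{piece}: irreducible={irr}, period of (0,0)'s class={per}")
```

For the bishop the BFS only reaches the 32 squares of the starting color, so the function reports the chain reducible while still giving the period of that communicating class.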
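Finally, the computations asked for in problems 2(b) and 1(a)-(d) follow one standard recipe: n-step probabilities come from powers of P, the stationary distribution is the normalized left eigenvector of P for eigenvalue 1, and for an irreducible positive recurrent chain the expected return time to state i is 1/pi_i. The sketch below demonstrates this on the reconstructed matrix from problem 2 (an assumption, as noted above); the same calls apply to problem 1 once its matrix entries are filled in.

```python
# Sketch: n-step probabilities, stationary distribution, and expected return
# times, demonstrated on the (reconstructed) matrix from problem 2.
import numpy as np

P = np.array([[0.6, 0.3, 0.1],
              [0.3, 0.3, 0.4],
              [0.4, 0.1, 0.5]])

# Two-step transition probabilities: (P^2)[i, j] = P(X2 = j | X0 = i),
# with Python's 0-based indices standing in for states 1..3.
P2 = np.linalg.matrix_power(P, 2)
print("P(X2 = 3 | X0 = 1) =", P2[0, 2])

# Stationary distribution: left eigenvector of P for eigenvalue 1, i.e. an
# eigenvector of P.T, normalized to sum to 1. Since every entry of P is
# positive, the chain is regular and this is also its limiting distribution,
# as well as the long-run fraction of time spent in each state.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()
print("stationary distribution:", pi)

# Expected return time to state i is 1 / pi_i -- the quantity asked for in
# problem 1(d), once that problem's matrix is known.
print("expected return times:", 1 / pi)
```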