Markov chain blog
2. (15 pts) Consider a Markov chain $\{X_n\}$ with state space $S = \{0, 1, 2\}$ and a given transition matrix $P$. Let the mapping $f : S \to S$ satisfy $f(0) = 0$ and $f(2) = 1$, and assume that $f(1) \neq f(2)$. If $Y_n = f(X_n)$, when is $\{Y_n\}$ a Markov chain? Is $\{Y_n\}$ always a Markov chain? In other words, are functions of Markov chains always Markov chains? (A simulation sketch related to this question is given at the end of this post.)

2. A Markov chain with state space $\{1, 2, 3\}$ has transition probability matrix

$$P = \begin{pmatrix} 0.6 & 0.3 & 0.1 \\ 0.3 & 0.3 & 0.4 \\ 0.4 & 0.1 & 0.5 \end{pmatrix}$$

(a) Is this Markov chain irreducible? Is the Markov chain recurrent or transient? Explain your answers.

(b) What is the period of state 1? Hence deduce the period of the remaining states. Does this Markov chain have a limiting distribution? (A numerical sketch for this chain is given at the end of this post.)

(c) Consider a general three-state Markov chain with transition matrix

$$P = \begin{pmatrix} p_{11} & p_{12} & p_{13} \\ p_{21} & p_{22} & p_{23} \\ p_{31} & p_{32} & p_{33} \end{pmatrix}$$

Give an example of a specific set of probabilities $p_{ij}$ for which the Markov chain is not irreducible (there is no single right answer to this, of course).

3. Researchers Hill and Barton used data collected on the results of 457 matches and found that the competitor wearing red won 248 times, whereas the competitor wearing blue won 209 times. We will carry out a simulation to assess whether or not the observed data provide evidence in support of the research conjecture. This simulation will employ the 3S strategy: determine the statistic, simulate could-have-been outcomes of the statistic under the null model, and assess the strength of evidence against the null model by estimating the p-value or the standardized statistic.

a. What is the statistic we will use? Calculate the observed value of the statistic in this study.

b. Describe how you could use a coin to develop a null distribution to test our hypothesis.

c. Use the One Proportion applet to test our hypothesis. Based on your simulation, find the p-value and write a conclusion. Also write down the mean and standard deviation of the null distribution. (A simulation sketch is given below.)
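For the first problem, here is a minimal simulation sketch in Python. Since the specific transition matrix from that problem is not reproduced above, the matrix `P` below is purely a hypothetical example, paired with a lumping map consistent with $f(0) = 0$, $f(2) = 1$, $f(1) \neq f(2)$, and chosen so that the lumped process fails the Markov property. The check simply compares empirical conditional frequencies of $Y_{n+1}$ given one versus two steps of the lumped history; if $\{Y_n\}$ were Markov they would agree.

```python
import numpy as np

# Hypothetical 3-state transition matrix (NOT the matrix from the problem
# statement), chosen so that lumping states 0 and 1 breaks the Markov property.
P = np.array([[0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0],
              [0.5, 0.5, 0.0]])
f = {0: 0, 1: 0, 2: 1}  # f(0) = 0, f(2) = 1, and f(1) = 0 != f(2)

rng = np.random.default_rng(0)
n_steps = 100_000
xs = [0]
for _ in range(n_steps):
    xs.append(rng.choice(3, p=P[xs[-1]]))
ys = [f[x] for x in xs]

# If {Y_n} were Markov, these two conditional frequencies would agree;
# for this particular P they settle near 1.0 and 0.5 respectively.
for j in (0, 1):
    times = [n for n in range(1, n_steps) if ys[n] == 0 and ys[n - 1] == j]
    p_hat = np.mean([ys[n + 1] == 1 for n in times])
    print(f"P(Y_{{n+1}}=1 | Y_n=0, Y_{{n-1}}={j}) ~ {p_hat:.3f}")
```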
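For parts (a) and (b) of the second problem, a quick numerical sketch (assuming NumPy) can back up the pencil-and-paper argument: every entry of $P$ is strictly positive, so each state reaches every other state in one step (irreducible) and every state has period 1, and a finite irreducible chain is recurrent. The sketch below also estimates the stationary distribution as the left eigenvector of $P$ for eigenvalue 1 and compares it with a high power of $P$.

```python
import numpy as np

# Transition matrix from parts (a)-(b), states 1, 2, 3.
P = np.array([[0.6, 0.3, 0.1],
              [0.3, 0.3, 0.4],
              [0.4, 0.1, 0.5]])

# All one-step probabilities are positive: irreducible and aperiodic.
print("min entry of P:", P.min())

# Stationary distribution: left eigenvector of P for eigenvalue 1,
# normalised to sum to 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi = pi / pi.sum()
print("stationary distribution:", pi)

# Rows of P^n approach the stationary distribution, which is what a
# limiting distribution looks like numerically.
print("first row of P^50:", np.linalg.matrix_power(P, 50)[0])
```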
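For the last problem, here is a sketch of the coin-flip simulation described in part (b), done with a pseudo-random binomial in Python rather than a physical coin or the One Proportion applet. Under the null model each of the 457 matches is a fair coin flip, the statistic is the sample proportion of red wins (observed $248/457 \approx 0.543$), and the p-value is estimated as the fraction of simulated proportions at least as large as the observed one. The exact numbers will vary from run to run.

```python
import numpy as np

# Null model for the Hill and Barton data: each of the 457 matches is a
# fair coin flip (red wins with probability 0.5), repeated many times.
rng = np.random.default_rng(1)
n_matches, observed_wins = 457, 248
observed_prop = observed_wins / n_matches   # about 0.543
n_reps = 10_000

null_wins = rng.binomial(n_matches, 0.5, size=n_reps)
null_props = null_wins / n_matches

print("observed proportion:", round(observed_prop, 3))
print("null distribution mean:", round(null_props.mean(), 3))
print("null distribution SD:", round(null_props.std(), 4))
# One-sided p-value: proportion of simulated statistics at least as
# extreme as the observed proportion.
print("estimated p-value:", np.mean(null_props >= observed_prop))
```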