Markov chain transition
Which of the following statements are true?

(p) All Markov chains must have a finite number of states.
(q) All irreducible Markov chains must have a finite number of states.
(r) All irreducible Markov chains are periodic.
(s) All irreducible Markov chains are aperiodic.
(t) All discrete-time Markov chains are irreducible.

Consider a Markov chain {X_n, n = 0, 1, ...} on the state space S = {0, 1, 2}. Suppose that the Markov chain has a 3 x 3 transition matrix P whose entries are fractions with denominator 10 (the individual entries are not recoverable here).

1. Show that the Markov chain has a unique stationary mass.
2. Let h denote the stationary mass of the Markov chain. Find h(x) for all x ∈ S.
3. Show that the Markov chain has a steady-state mass.
4. Let h* denote the steady-state mass of the Markov chain. Find h*(x) for all x ∈ S.

(A power-iteration sketch for a chain of this form appears at the end of this section.)

4. Consider the Markov chain X = {X_n} with state space S = {0, 1, 2, ...} and transition probabilities

       P_ij = 1 if j = i - 1, and P_ij = 0 otherwise, for i ≥ 1;
       P_00 = 0 and P_0j = ... for j ≥ 1.

   (a) Is this Markov chain irreducible? Determine the period of every state.
   (b) Is the Markov chain recurrent or transient? Explain.
   (c) Is the Markov chain positive recurrent? If so, compute the stationary probability distribution.
   (d) For each state i, what is the expected number of steps to return to state i if the Markov chain X starts at state i?

5. Consider a Markov chain X = {X_n} with state space S = {0, 1, 2, ...} and transition probability matrix

       P = [ 0  1  0  0  0  ...
             p  0  q  0  0  ...
             0  p  0  q  0  ...
             0  0  p  0  q  ...
             ...                ]

   Here p > 0, q > 0 and p + q = 1. Determine when the chain is positive recurrent and compute its stationary distribution. (A simulation sketch for this chain appears at the end of this section.)

RightTriangle changes state in setBase or setHeight:

    public void setBase(int newBase) {
        this.base = newBase;
        setHypotenuse();
        setChanged();
        notifyObservers();
    }

    public void setHeight(int newHeight) {
        this.height = newHeight;
        setHypotenuse();   // recompute the derived hypotenuse before notifying observers
        setChanged();
        notifyObservers();
    }
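
The snippet above relies on the setChanged()/notifyObservers() pair from java.util.Observable. Below is a minimal sketch of how the whole class and an observer might be wired up, assuming RightTriangle extends java.util.Observable and keeps base, height, and hypotenuse fields; the names HypotenusePrinter and Demo, and the field layout, are assumptions for illustration, not part of the original.

    import java.util.Observable;
    import java.util.Observer;

    // Hypothetical observable RightTriangle; field and helper names are assumed.
    class RightTriangle extends Observable {
        private int base;
        private int height;
        private double hypotenuse;

        public void setBase(int newBase) {
            this.base = newBase;
            setHypotenuse();
            setChanged();          // mark this Observable as changed
            notifyObservers();     // push the update to every registered observer
        }

        public void setHeight(int newHeight) {
            this.height = newHeight;
            setHypotenuse();
            setChanged();
            notifyObservers();
        }

        private void setHypotenuse() {
            this.hypotenuse = Math.sqrt((double) base * base + (double) height * height);
        }

        public double getHypotenuse() {
            return hypotenuse;
        }
    }

    // An observer that reacts each time the triangle changes state.
    class HypotenusePrinter implements Observer {
        @Override
        public void update(Observable o, Object arg) {
            System.out.println("New hypotenuse: " + ((RightTriangle) o).getHypotenuse());
        }
    }

    public class Demo {
        public static void main(String[] args) {
            RightTriangle t = new RightTriangle();
            t.addObserver(new HypotenusePrinter());
            t.setBase(3);      // prints "New hypotenuse: 3.0" (height is still 0)
            t.setHeight(4);    // prints "New hypotenuse: 5.0"
        }
    }

Note that setChanged() must be called before notifyObservers(), otherwise Observable silently skips the notification; java.util.Observable is deprecated in recent JDKs, so a hand-rolled listener list would be the modern substitute.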
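
Returning to the Markov-chain exercises: for the three-state problem on S = {0, 1, 2}, one way to sanity-check a hand computation of the stationary mass h is power iteration, i.e. starting from any distribution and repeatedly multiplying by P. The matrix below is a made-up stand-in (the entries from the problem statement are not reproduced here), so the printed numbers are illustrative only.

    // Power-iteration sketch for a 3-state chain on S = {0, 1, 2}.
    // NOTE: this transition matrix is a hypothetical stand-in, not the one from the problem.
    public class ThreeStateStationary {
        public static void main(String[] args) {
            double[][] P = {
                {0.2, 0.3, 0.5},
                {0.3, 0.2, 0.5},
                {0.4, 0.4, 0.2}
            };

            // Start from the uniform distribution and iterate h <- h P.
            double[] h = {1.0 / 3, 1.0 / 3, 1.0 / 3};
            for (int step = 0; step < 1000; step++) {
                double[] next = new double[3];
                for (int i = 0; i < 3; i++)
                    for (int j = 0; j < 3; j++)
                        next[j] += h[i] * P[i][j];
                h = next;
            }

            // For a finite, irreducible, aperiodic chain the iterates converge to the
            // unique stationary mass, which also equals the steady-state mass h*.
            for (int x = 0; x < 3; x++)
                System.out.printf("h(%d) ~ %.4f%n", x, h[x]);
        }
    }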
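
For problem 5, a short simulation can suggest, before any calculation, whether the chain is positive recurrent for a given p: under positive recurrence the empirical occupation frequencies settle down and the average return time to state 0 stays finite (it equals 1/pi(0) in the limit). The walk simulated below uses the reconstruction P(0,1) = 1, P(i, i-1) = p, P(i, i+1) = q for i ≥ 1; the value p = 0.7, the seed, and the run length are illustrative choices only.

    import java.util.Random;

    // Simulation sketch for the reflected random walk of problem 5.
    public class ReflectedWalkSimulation {
        public static void main(String[] args) {
            double p = 0.7;                 // probability of stepping down; q = 1 - p steps up
            long steps = 2_000_000L;
            Random rng = new Random(42);

            int state = 0;
            long[] visits = new long[10];   // time spent in states 0..9
            long returnsToZero = 0;

            for (long t = 0; t < steps; t++) {
                if (state < visits.length) visits[state]++;
                if (state == 0) {
                    state = 1;              // from 0 the chain moves to 1 with probability 1
                } else {
                    state += (rng.nextDouble() < p) ? -1 : 1;
                }
                if (state == 0) returnsToZero++;
            }

            System.out.println("Empirical occupation frequencies:");
            for (int i = 0; i < visits.length; i++)
                System.out.printf("state %d: %.5f%n", i, visits[i] / (double) steps);

            // Rough estimate of the mean return time to state 0.
            System.out.printf("average return time to 0: %.3f steps%n",
                    steps / (double) returnsToZero);
        }
    }

Trying a value of p below 1/2 instead makes the excursions away from 0 longer and longer, which is the qualitative signature of losing positive recurrence under this labelling of p and q.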