Answers needed
What is the difference between univariate data and bivariate data? Choose the correct answer below.
A. In univariate data, a single variable is measured on each individual. In bivariate data, two variables are measured on each individual.
B. In univariate data, there are only positive values and zeros. In bivariate data, there are positive values, negative values, and zeros.
C. In univariate data, there is one mean. In bivariate data, there are two means.
D. In univariate data, the data are qualitative. In bivariate data, the data are quantitative.

Consider a Markov chain {X_n, n = 0, 1, ...} on the state space S = {0, 1, 2}. Suppose that the Markov chain has the transition matrix P = [p_ij], a 3 x 3 stochastic matrix whose entries are fractions with denominator 10.
1. Show that the Markov chain has a unique stationary mass.
2. Let h denote the stationary mass of the Markov chain. Find h(x) for all x ∈ S.
3. Show that the Markov chain has a steady-state mass.
4. Let h* denote the steady-state mass of the Markov chain. Find h*(x) for all x ∈ S.

Consider a standard chessboard with an 8 x 8 grid of possible locations. We define a Markov chain by randomly moving a single chess piece on this board. The initial location X_0 is sampled uniformly among the 8^2 = 64 squares. At time t, the piece then chooses X_{t+1} by sampling uniformly from the set of legal moves given its current location X_t. For a description of legal chess moves, see http://en.wikipedia.org/wiki/Rules_of_chess#Basic_moves.
a) Suppose the chess piece is a king, which can move to any of the 8 adjacent squares. Is the Markov chain irreducible? Is the Markov chain aperiodic?
b) Suppose the chess piece is a bishop. Is the Markov chain irreducible? Is the Markov chain aperiodic?
c) Suppose the chess piece is a knight. Is the Markov chain irreducible? Is the Markov chain aperiodic?

Markov chain transitions. Let
P = [P_ij] =
    [ 1/4  1/4  1/2 ]
    [ 1/4  1/2  1/4 ]
    [ 1/2  1/4  1/4 ]
Let X_1 be distributed uniformly over the states {0, 1, 2}, and let {X_i} be a Markov chain with transition matrix P; thus P(X_{n+1} = j | X_n = i) = P_ij for i, j ∈ {0, 1, 2}.
(a) Is the information source stationary?
(b) Find the stationary distribution of the Markov chain.
(c) Find the entropy rate of the Markov chain.
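
For the two three-state chains above, a hand-derived stationary distribution can be cross-checked numerically: find the left eigenvector of P for eigenvalue 1, and raise P to a large power to see whether its rows converge to that vector (the steady state). The sketch below is a minimal check, not a proof; it uses the matrix transcribed in the last question, and the helper name stationary_distribution and the power 50 are my own choices. For the first three-state chain, substitute its transition matrix.

```python
import numpy as np

def stationary_distribution(P):
    """Left eigenvector of P for eigenvalue 1, normalized to sum to 1 (solves pi P = pi)."""
    vals, vecs = np.linalg.eig(P.T)
    k = np.argmin(np.abs(vals - 1.0))      # eigenvalue closest to 1
    pi = np.real(vecs[:, k])
    return pi / pi.sum()

# Transition matrix transcribed in the last question above (doubly stochastic).
P = np.array([[1/4, 1/4, 1/2],
              [1/4, 1/2, 1/4],
              [1/2, 1/4, 1/4]])

pi = stationary_distribution(P)
print("stationary distribution:", pi)        # a doubly stochastic P gives the uniform vector

# Steady-state check: P^n should converge to a matrix whose rows all equal pi.
print("P^50 =\n", np.linalg.matrix_power(P, 50))
```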
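For the chessboard questions, irreducibility and aperiodicity can be sanity-checked computationally. King, bishop, and knight moves are reversible, so the move graph is undirected; the chain is then irreducible exactly when that graph is connected on all 64 squares, and a connected component is aperiodic exactly when it is not bipartite. The sketch below runs a BFS with 2-coloring from one corner square; the helper names moves and explore are mine, not part of the problem statement, and the bipartiteness verdict refers to the component reached from that corner.

```python
from collections import deque

BOARD = 8  # squares per side

def moves(piece, sq):
    """Return the legal destination squares for `piece` from square sq = (row, col)."""
    r, c = sq
    if piece == "king":
        deltas = [(dr, dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1) if (dr, dc) != (0, 0)]
        max_step = 1
    elif piece == "knight":
        deltas = [(1, 2), (2, 1), (-1, 2), (-2, 1), (1, -2), (2, -1), (-1, -2), (-2, -1)]
        max_step = 1
    elif piece == "bishop":
        deltas = [(1, 1), (1, -1), (-1, 1), (-1, -1)]
        max_step = BOARD - 1   # a bishop slides any number of squares diagonally
    dests = []
    for dr, dc in deltas:
        for k in range(1, max_step + 1):
            nr, nc = r + k * dr, c + k * dc
            if 0 <= nr < BOARD and 0 <= nc < BOARD:
                dests.append((nr, nc))
            else:
                break  # slid off the board in this direction
    return dests

def explore(piece, start=(0, 0)):
    """BFS from `start`: count reachable squares and 2-color the move graph."""
    color = {start: 0}
    queue = deque([start])
    bipartite = True
    while queue:
        u = queue.popleft()
        for v in moves(piece, u):
            if v not in color:
                color[v] = color[u] ^ 1
                queue.append(v)
            elif color[v] == color[u]:
                bipartite = False  # edge inside one color class => odd cycle
    return len(color), bipartite

for piece in ("king", "bishop", "knight"):
    reached, bipartite = explore(piece)
    print(f"{piece:6s}: reaches {reached}/64 squares (irreducible: {reached == 64}); "
          f"bipartite move graph: {bipartite} (period 2 if True, aperiodic if False)")
```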
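For part (c) of the last question, the entropy rate of a stationary Markov chain is H = -sum_i pi_i sum_j P_ij log2 P_ij, i.e. the stationary average of the row entropies. A quick numerical evaluation follows, again with my own variable names and assuming the matrix as transcribed above:

```python
import numpy as np

P = np.array([[1/4, 1/4, 1/2],
              [1/4, 1/2, 1/4],
              [1/2, 1/4, 1/4]])
pi = np.array([1/3, 1/3, 1/3])   # uniform; P is doubly stochastic, so this is stationary

# Entropy rate of a stationary Markov chain: H = -sum_i pi_i * sum_j P_ij * log2(P_ij).
row_entropy = -(P * np.log2(P)).sum(axis=1)      # entropy of each row, in bits
print("entropy rate:", float(pi @ row_entropy), "bits per symbol")
```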