Long Question: Markov chain

In many applications, some variables evolve over time in a random (stochastic) way, in that even if you know everything up to time $t$, it is still impossible to know for sure which values the variables will take in the future. Stochastic processes are mathematical models that describe these phenomena when the randomness is driven by something outside the control of the relevant decision makers. For example, stock prices can be (and usually are) modeled as stochastic processes, since it is difficult for an investor to affect them. However, in a chess game, the uncertainty in your opponent's future moves is not modeled as a stochastic process, since those moves are made by a decision maker whose goal is to defeat you and who may adjust them based on your moves. One simple and widely used class of stochastic processes is Markov chains. In this question, we study Markov chains on a finite state space. There is a sequence of random variables $x_0, x_1, \dots, x_t, \dots$, each taking values in a finite set $S$ called the state space. The subscripts have the interpretation of time: given an integer time $t$, the values $x_0, \dots, x_t$ are assumed to be known at that time, while $x_{t+1}, x_{t+2}, \dots$ remain random. For convenience, we label the states with positive integers: $S = \{1, \dots, n\}$, where $n$ is the number of possible states.

(a) Is $S$ a good choice of sample space for describing this stochastic process? No matter what the answer is, for the rest of the question we assume that a good sample space $\Omega$ (which might be $S$ if your answer is "yes") has been chosen to carry the random variables.

(b) The fundamental assumption of Markov chains is that given $x_0, \dots, x_t$, the probability that $x_{t+1} = j$ is $p_{j,i}$, where $i$ is the value of $x_t$. This holds for every $t = 0, 1, \dots$, and the numbers $p_{j,i}$ are independent of $t$. More precisely, for any fixed $t$ and the event $A$ that $x_0 = a_0, \dots, x_t = a_t$ (where $a_0, a_1, \dots, a_t$ are integers in $S$), $P(\{x_{t+1} = j\} \cap A) = P(A)\, p_{j,a_t}$; this holds for all $a_0, a_1, \dots, a_t$. (The so-called Markov property means that the value of $x_{t+1}$ only depends on $x_t$ and not on history further back.) Let $P$ be the $n \times n$ matrix whose $(j, i)$ entry is $p_{j,i}$ for all $i, j \in S$; it is called the transition matrix of the Markov chain. Show that for the probabilities to be well-defined for all laws of $x_0$, each column of $P$ must sum to one.

(c) A law on the state space $S$ can be represented by an $n \times 1$ matrix (column vector) whose $(i, 1)$ entry is the probability of $\{i\}$. A function $f : S \to \mathbb{R}$ can be represented by a $1 \times n$ matrix whose $(1, i)$ entry is the number $f(i)$. (Notice that $f$ is not a random variable unless your answer to Part (a) is "yes".) If the law of $x_0$ is $\mu_0$, then what is the interpretation of $f\mu_0$ (matrix product)?

(d) In what follows, fix a law $\mu_0$ of $x_0$ and a function $f : S \to \mathbb{R}$. What is the law of $x_1$?

(e) Notice that $fP$ is a $1 \times n$ matrix, so it represents a function on $S$. What does that function mean intuitively? (As we can see, the transformation from $f$ to $fP$ is another way to describe the transition matrix $P$, and this transformation plays an important role when studying more complicated Markov processes.)

(f) What is the law of $x_t$?
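To make the matrix conventions concrete, here is a minimal numerical sketch of the objects appearing in parts (b)-(e), assuming NumPy and the column-stochastic convention of part (b); the particular chain, the law, and the function used here are illustrative assumptions only, and the sketch computes the relevant matrix products without interpreting them (that interpretation is what the question asks for).

    # Minimal sketch of the objects in this question (NumPy assumed;
    # states are 0-indexed here, whereas the question uses S = {1, ..., n}).
    import numpy as np

    n = 3  # number of states in this illustrative example

    # Transition matrix P: the (j, i) entry is p_{j,i} = P(x_{t+1} = j | x_t = i).
    # Under this convention each COLUMN of P sums to one (part (b)).
    P = np.array([
        [0.5, 0.2, 0.0],
        [0.3, 0.5, 0.4],
        [0.2, 0.3, 0.6],
    ])
    assert np.allclose(P.sum(axis=0), 1.0)

    # A law on S as an n-by-1 column vector: the (i, 1) entry is the probability of {i}.
    mu0 = np.array([[0.6], [0.3], [0.1]])

    # A function f : S -> R as a 1-by-n row vector: the (1, i) entry is f(i).
    f = np.array([[1.0, -2.0, 5.0]])

    # The matrix products that parts (c)-(e) ask about:
    print(f @ mu0)   # f * mu0, a 1-by-1 matrix (part (c))
    print(P @ mu0)   # P * mu0, an n-by-1 column vector (part (d))
    print(f @ P)     # f * P,  a 1-by-n row vector (part (e))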