Question
In many applications, some variables evolve over time in a random (stochastic) way, in the sense that even if you know everything up to time t, it is still impossible to know for sure which values the variables will take in the future. Stochastic processes are mathematical models that describe these phenomena when the randomness is driven by something outside the control of the relevant decision makers. For example, stock prices can be (and usually are) modeled as stochastic processes, since it is difficult for an individual investor to affect them. In a chess game, however, the uncertainty in your opponent's future moves is not modeled as a stochastic process, since those moves are made by a decision maker whose goal is to defeat you and who may adjust them in response to your moves.

One simple and widely used class of stochastic processes is the class of Markov chains. In this question, we study Markov chains on a finite state space. There is a sequence of random variables x_0, x_1, ..., x_t, ..., each taking values in a finite set S called the state space. The subscripts are interpreted as time: given an integer time t, the values x_0, ..., x_t are assumed to be known at that time, while x_{t+1}, x_{t+2}, ... remain random. For convenience, we label the states with positive integers: S = {1, ..., n}, where n is the number of possible states.

(a) Is S a good choice of sample space for describing this stochastic process? Whatever the answer is, for the rest of the question we assume that a good sample space (which might be S if your answer is "yes") has been chosen to carry the random variables.

(b) The fundamental assumption of Markov chains is that, given x_0, ..., x_t, the probability that x_{t+1} = j is p_{ji}, where i is the value of x_t. This holds for every t = 0, 1, ..., and the numbers p_{ji} are independent of t. More precisely, for any fixed t and the event A that x_0 = a_0, ..., x_t = a_t (where a_0, ..., a_t are integers in S), P({x_{t+1} = j} ∩ A) = P(A) p_{j,a_t}; this holds for all a_0, a_1, ..., a_t. (The so-called Markov property means that the value of x_{t+1} depends only on x_t and not on history further back.) Let P be the n x n matrix whose (j, i) entry is p_{ji} for all i, j ∈ S; it is called the transition matrix of the Markov chain. Show that, for the probabilities to be well-defined for all laws of x_0, each column of P must sum to one.

(c) A law on the state space S can be represented by an n x 1 matrix (column vector) whose (i, 1) entry is the probability of {i}. A function f : S → R can be represented by a 1 x n matrix whose (1, i) entry is the number f(i). (Notice that f is not a random variable unless your answer to Part (a) is "yes".) If the law of x_0 is μ_0, then what is the interpretation of f μ_0 (matrix product)?

(d) In what follows, fix a law μ_0 of x_0 and a function f : S → R. What is the law of x_1?

(e) Notice that fP is a 1 x n matrix, so it represents a function on S. What does that function mean intuitively? (As we can see, the transformation from f to fP is another way to describe the transition matrix P, and this transformation plays an important role when studying more complicated Markov processes.)

(f) What is the law of x_t?
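To make these objects concrete, here is a minimal numerical sketch (not part of the original question). It assumes a hypothetical 3-state chain with made-up transition probabilities, an assumed initial law μ_0 (written mu0 in the code), and an assumed function f, checks the column-sum condition from part (b), and evaluates the matrix products f μ_0, P μ_0, fP, and P^t μ_0 that parts (c) through (f) ask about; the interpretations in the comments are the standard ones for this column-stochastic convention.

```python
import numpy as np

# Hypothetical 3-state chain, S = {1, 2, 3}; all numbers below are made up for illustration.
# Column-stochastic convention from part (b): the (j, i) entry of P is P(x_{t+1} = j | x_t = i),
# so column i is the law of x_{t+1} given x_t = i, and each column sums to one.
P = np.array([[0.5, 0.2, 0.1],
              [0.3, 0.6, 0.2],
              [0.2, 0.2, 0.7]])
assert np.allclose(P.sum(axis=0), 1.0)   # part (b): each column of P sums to one

# A law mu_0 on S as an n x 1 column vector, and a function f : S -> R as a 1 x n row vector.
mu0 = np.array([[0.4], [0.4], [0.2]])    # P(x_0 = 1), P(x_0 = 2), P(x_0 = 3)
f = np.array([[1.0, 5.0, -2.0]])         # f(1), f(2), f(3)

print(f @ mu0)   # part (c): a 1 x 1 matrix whose entry is the expectation E[f(x_0)]
print(P @ mu0)   # part (d): the law of x_1, again a column vector summing to one
print(f @ P)     # part (e): a 1 x n row vector; its i-th entry is E[f(x_{t+1}) | x_t = i]

t = 4  # arbitrary time, chosen only for illustration
print(np.linalg.matrix_power(P, t) @ mu0)  # part (f): the law of x_t is P^t mu_0
```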
Step by Step Solution
Step: 1
(a) It depends. For example, if we are only interested in the set of possible values that a part...