Question
STAT 430 2GR, 2UG — Spring 2016 — A. Stepanov
Homework #1 (due Friday, February 5, by 4:00 p.m.)
Please include your name (with your last name underlined) and your NetID at the top of the first page.

1. HPS 1.7

2. Let { X_n : n ≥ 0 } be a Markov chain with state space S and transition probability function P(x, y). Find the expression for P(X_1 = y | X_0 = x, X_2 = z), x, y, z ∈ S.

3. A medical researcher is studying the risk of heart attack in men. She first divides men into three weight categories: thin, normal, and overweight. By studying the male ancestors, sons, and grandsons of these men, the researcher comes up with the following transition probability matrix.

                 Thin    Normal   Overweight
   Thin          0.30    0.50     0.20
   Normal        0.10    0.60     0.30
   Overweight    0.10    0.50     0.40

   a) Find the probabilities of the following for a man of normal weight.
      i) Thin son        ii) Thin grandson        iii) Thin great-grandson
   b) Find the probabilities of the following for an overweight man.
      i) Overweight son  ii) Overweight grandson  iii) Overweight great-grandson
   c) Suppose that the distribution of men by weight is initially given by [ 0.15  0.60  0.25 ]. Find each of the following distributions.
      i) After 1 generation   ii) After 2 generations   iii) After 3 generations

4. Let { X_n : n = 0, 1, 2, 3, ... } be a Markov chain with state space { 0, 1, 2 } and transition probability matrix

            0       1       2
   0        0.50    0.30    0.20
   1        0.60    0.40    0
   2        0       0       1

   Starting from 0, what is the probability that the process never enters 1?

5. A fair die is tossed repeatedly. The maximum of the first n outcomes is denoted by X_n. Is { X_n : n = 1, 2, 3, ... } a Markov chain? Why or why not? If it is a Markov chain, specify the states, find the transition probability matrix, and determine which states are recurrent and which are transient.

6. HPS 1.19

7. HPS 1.20

8. Suppose that whether or not a bidder is successful on a bid depends on the successes and failures of his previous two bids. If his last two bids were successful, his next bid will be successful with probability 0.50. If only one of his last two bids was successful, the probability is 0.60 that the next bid will be successful. Finally, if neither of his last two bids was successful, the probability is 0.70 that the next one will be successful. If, at step n, the state of the process is defined based on whether bid n is a success or a failure, then the process is NOT a Markov chain. Define X_n to be sf if, at step n, the last bid was a failure and the bid before that was a success. Define ss, fs, and ff similarly. Let state 0 = ff, state 1 = fs, state 2 = sf, and state 3 = ss. Then { X_n : n ≥ 2 } is a Markov chain. Find the transition probability matrix.
   Hint: P(X_{n+1} = 1 | X_n = 2) = 0.60; P(X_{n+1} = 3 | X_n = 2) = 0, since it is not possible to move from sf to ss in one step.
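For problem 3, the quantities asked for are n-step transition probabilities (entries of the n-th matrix power) and the forward evolution of an initial row vector. The following is a minimal numpy sketch of that computation, not part of the assignment; the matrix and initial distribution are copied from the problem statement and the variable names are my own.

```python
import numpy as np

# Rows/columns ordered Thin, Normal, Overweight, as in the problem.
P = np.array([
    [0.30, 0.50, 0.20],   # Thin
    [0.10, 0.60, 0.30],   # Normal
    [0.10, 0.50, 0.40],   # Overweight
])

# n-step transition probabilities are the entries of P^n.
P2 = np.linalg.matrix_power(P, 2)
P3 = np.linalg.matrix_power(P, 3)

# a) For a man of normal weight: row "Normal" (index 1), column "Thin" (index 0).
print("P(thin son | normal):           ", P[1, 0])
print("P(thin grandson | normal):      ", P2[1, 0])
print("P(thin great-grandson | normal):", P3[1, 0])

# c) The initial row vector multiplied by P^n gives the distribution
#    after n generations.
pi0 = np.array([0.15, 0.60, 0.25])
for n in (1, 2, 3):
    print(f"distribution after {n} generation(s):",
          pi0 @ np.linalg.matrix_power(P, n))
```

Part b) follows the same pattern with row "Overweight" (index 2) and column "Overweight" (index 2).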
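For problem 4, a Monte Carlo simulation is one way to sanity-check an analytic answer; the sketch below is an assumption about how such a check could be done, not the intended pen-and-paper solution. Since state 2 is absorbing, every path started from state 0 either visits state 1 or is eventually absorbed at 2.

```python
import numpy as np

rng = np.random.default_rng(0)

# Transition matrix from the problem statement (states 0, 1, 2).
P = np.array([
    [0.50, 0.30, 0.20],
    [0.60, 0.40, 0.00],
    [0.00, 0.00, 1.00],
])

def avoids_state_1(rng, max_steps=10_000):
    """Run one path from state 0; return True if it reaches the absorbing
    state 2 without ever visiting state 1."""
    state = 0
    for _ in range(max_steps):
        state = rng.choice(3, p=P[state])
        if state == 1:
            return False
        if state == 2:
            return True
    return True  # step cap reached; effectively never happens here

n_paths = 100_000
estimate = sum(avoids_state_1(rng) for _ in range(n_paths)) / n_paths
print("estimated P(never enter 1 | start at 0):", estimate)
```

The estimate should agree, up to sampling error, with the exact value obtained by a first-step (conditioning) argument.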
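For problem 8, the transition matrix can be assembled directly from the stated success probabilities once each state is read as the pair (outcome of the bid before last, outcome of the last bid). The sketch below is my own construction under that reading, not the course's official solution; running it reproduces the hinted values P(X_{n+1} = 1 | X_n = 2) = 0.60 and P(X_{n+1} = 3 | X_n = 2) = 0.

```python
import numpy as np

states = ["ff", "fs", "sf", "ss"]          # states 0, 1, 2, 3 as in the problem
p_success = {0: 0.70, 1: 0.60, 2: 0.50}    # keyed by number of recent successes

P = np.zeros((4, 4))
for i, (prev, last) in enumerate(states):
    p = p_success[(prev == "s") + (last == "s")]
    # The next state drops the older bid and appends the new outcome.
    P[i, states.index(last + "s")] += p        # next bid succeeds
    P[i, states.index(last + "f")] += 1 - p    # next bid fails

print(P)
```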