Question:
The state of a process changes daily according to a two-state Markov chain. If the process is in state i during one day, then it is in state j the following day with probability P_{i,j}, where
Every day a message is sent. If the state of the Markov chain that day is i, then the message sent is "good" with probability p_i and is "bad" with probability q_i = 1 − p_i, i = 0, 1.
(a) If the process is in state 0 on Monday, what is the probability that a good message is sent on Tuesday?
(b) If the process is in state 0 on Monday, what is the probability that a good message is sent on Friday?
(c) In the long run, what proportion of messages are good?
(d) Let Y_n equal 1 if a good message is sent on day n and let it equal 2 otherwise. Is {Y_n, n ≥ 1} a Markov chain? If so, give its transition probability matrix. If not, briefly explain why not.
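As a quick numerical illustration of parts (a)–(c), the sketch below uses hypothetical values for the transition matrix P and for p_0, p_1 (neither is reproduced in the question text above); only the structure of the computation is meant to carry over, not the specific numbers.

```python
import numpy as np

# Hypothetical placeholder values -- the problem's actual transition
# matrix and message probabilities are not reproduced in the text above.
P = np.array([[0.3, 0.7],    # P[i, j] = P(state j tomorrow | state i today)
              [0.2, 0.8]])
p = np.array([0.6, 0.4])     # p[i] = P(message is good | chain is in state i)

# (a) Good message on Tuesday given state 0 on Monday:
#     one step of the chain, then condition on the state reached.
prob_tuesday = P[0] @ p
print("P(good Tuesday | state 0 Monday) =", prob_tuesday)

# (b) Good message on Friday given state 0 on Monday:
#     four transitions (Mon -> Tue -> Wed -> Thu -> Fri), so use P^4.
P4 = np.linalg.matrix_power(P, 4)
prob_friday = P4[0] @ p
print("P(good Friday | state 0 Monday) =", prob_friday)

# (c) Long-run proportion of good messages:
#     weight p by the stationary distribution pi, where pi = pi P, sum(pi) = 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1))])
pi = pi / pi.sum()
print("long-run proportion of good messages =", pi @ p)
```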
Step by Step Answer: