Question:

36. The state of a process changes daily according to a two-state Markov chain. If the process is in state i during one day, then it is in state j the following day with probability P_{i,j}, where P_{0,0} = 0.4, P_{0,1} = 0.6, P_{1,0} = 0.2, P_{1,1} = 0.8. Every day a message is sent. If the state of the Markov chain that day is i, then the message sent is "good" with probability p_i and is "bad" with probability q_i = 1 − p_i, i = 0, 1.

(a) If the process is in state 0 on Monday, what is the probability that a good message is sent on Tuesday?
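Since p_0 and p_1 are left symbolic, the answer to (a) is a formula obtained by conditioning on Tuesday's state. A minimal sketch (the numeric values passed in below are illustrative assumptions, not given in the problem):

```python
# Part (a): from state 0 on Monday, Tuesday's state is 0 w.p. P00 = 0.4
# and 1 w.p. P01 = 0.6, so P(good on Tuesday) = 0.4*p0 + 0.6*p1.
def p_good_tuesday(p0, p1):
    return 0.4 * p0 + 0.6 * p1

# Illustrative values for p0, p1 (assumed; the problem leaves them symbolic):
print(p_good_tuesday(0.9, 0.3))  # 0.4*0.9 + 0.6*0.3 = 0.54
```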

(b) If the process is in state 0 on Monday, what is the probability that a good message is sent on Friday?
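Monday to Friday is four daily transitions, so Friday's state distribution (starting from state 0) is row 0 of P^4. A plain-Python sketch of that computation:

```python
# Part (b): Monday -> Friday is 4 daily transitions, so Friday's state
# distribution (starting from state 0) is row 0 of P^4.
def mat_mult(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

P = [[0.4, 0.6], [0.2, 0.8]]
P4 = P
for _ in range(3):          # P -> P^2 -> P^3 -> P^4
    P4 = mat_mult(P4, P)

# Row 0 of P^4 is [0.2512, 0.7488] (up to rounding), so
# P(good on Friday) = 0.2512*p0 + 0.7488*p1.
print(P4[0])
```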

(c) In the long run, what proportion of messages are good?
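Part (c) calls for the stationary distribution of the chain; for a two-state chain it has the closed form pi_0 = P_{1,0}/(P_{0,1} + P_{1,0}). A short sketch:

```python
# Part (c): stationary distribution of the two-state chain.
# Solving pi0 = 0.4*pi0 + 0.2*pi1 with pi0 + pi1 = 1 gives
# pi0 = P10 / (P01 + P10) = 0.2 / (0.6 + 0.2) = 0.25, pi1 = 0.75.
pi0 = 0.2 / (0.6 + 0.2)
pi1 = 1 - pi0

def long_run_good(p0, p1):
    # Long-run proportion of good messages: 0.25*p0 + 0.75*p1.
    return pi0 * p0 + pi1 * p1
```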

(d) Let Yn equal 1 if a good message is sent on day n and let it equal 2 otherwise.
Is {Yn, n ≥ 1} a Markov chain? If so, give its transition probability matrix. If not, briefly explain why not.
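One way to probe part (d) is to compute, by summing over the hidden state paths, the probability of a good message tomorrow given today's message, with and without one more day of history; if the extra history changes the answer, {Yn} cannot be Markov. A sketch under assumed illustrative values p0 = 0.9, p1 = 0.3 (not given in the problem) and the stationary initial distribution:

```python
from itertools import product

# Hidden-state transition matrix, its stationary distribution, and
# illustrative message probabilities (p0, p1 assumed; problem leaves them symbolic).
P = [[0.4, 0.6], [0.2, 0.8]]
pi = [0.25, 0.75]
p_good = [0.9, 0.3]

def joint(msgs):
    """P(Y1..Yn = msgs), msgs coded 1 = good, 2 = bad, summed over state paths."""
    total = 0.0
    for states in product((0, 1), repeat=len(msgs)):
        pr = pi[states[0]]
        for a, b in zip(states, states[1:]):
            pr *= P[a][b]
        for s, m in zip(states, msgs):
            pr *= p_good[s] if m == 1 else 1 - p_good[s]
        total += pr
    return total

p_g_given_g  = joint((1, 1)) / joint((1,))        # P(Y2=good | Y1=good)
p_g_given_gg = joint((1, 1, 1)) / joint((1, 1))   # P(Y3=good | Y1=Y2=good)
# The two conditionals differ (0.48 vs 0.4875 for these values), so the extra
# history matters and {Yn} is not a Markov chain here; it would be if p0 == p1,
# since the messages would then carry no information about the hidden state.
print(p_g_given_g, p_g_given_gg)
```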
