
Question:

Buses arrive at a bus station with i.i.d. interarrival times following an exponential distribution with intensity λ. Alice arrives at the bus station at a deterministic time t.

a. (4 pts) What is Alice's expected waiting time until the next bus arrives?
b. (6 pts) Let τ be the time at which the last bus before time t arrived. Show that t − τ follows an exponential distribution with parameter λ.
c. (6 pts) Show that the expected interarrival time between the last bus that arrived before time t and the first bus that arrives after time t is 2/λ. Explain why this differs from the general expected interarrival time 1/λ.
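As a quick numerical illustration of parts (a)-(c) (not a substitute for the proofs the question asks for), here is a minimal Monte Carlo sketch in Python; the rate lam = 2.0, the inspection time t = 50.0, and the number of runs are arbitrary choices made for this sketch. It estimates Alice's waiting time, the backward recurrence time t − τ, and the length of the interarrival interval that straddles t.

# Minimal Monte Carlo sketch for parts (a)-(c); lam, t and n_runs are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
lam, t, n_runs = 2.0, 50.0, 100_000

wait, back, straddle = [], [], []
for _ in range(n_runs):
    # Build arrival times well past t by cumulating Exp(lam) interarrival gaps.
    gaps = rng.exponential(1.0 / lam, size=int(3 * lam * t) + 50)
    arrivals = np.cumsum(gaps)
    i = np.searchsorted(arrivals, t)          # index of the first arrival after t
    wait.append(arrivals[i] - t)              # (a) Alice's waiting time
    last = arrivals[i - 1] if i > 0 else 0.0  # time tau of the last bus before t
    back.append(t - last)                     # (b) backward recurrence time t - tau
    straddle.append(arrivals[i] - last)       # (c) interarrival interval containing t

print("E[waiting time]      ~", np.mean(wait),     "(theory: 1/lam =", 1 / lam, ")")
print("E[t - tau]           ~", np.mean(back),     "(theory: 1/lam =", 1 / lam, ")")
print("E[straddling length] ~", np.mean(straddle), "(theory: 2/lam =", 2 / lam, ")")

The first two estimates come out near 1/λ and the third near 2/λ: a deterministic time t is more likely to land in a longer-than-average gap, so the gap containing t is length-biased, which is why the answer in (c) differs from the unconditional mean interarrival time 1/λ.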

Transcribed from the attached images:

2. A Markov chain with state space {1, 2, 3} has transition probability matrix

        0.6  0.3  0.1
   P =  0.3  0.3  0.4
        0.4  0.1  0.5

(a) Is this Markov chain irreducible? Is the Markov chain recurrent or transient? Explain your answers.

(b) What is the period of state 1? Hence deduce the period of the remaining states. Does this Markov chain have a limiting distribution?

(c) Consider a general three-state Markov chain with transition matrix

        p11  p12  p13
   P =  p21  p22  p23
        p31  p32  p33

Give an example of a specific set of probabilities pij for which the Markov chain is not irreducible (there is no single right answer to this, of course!).

4. Consider a discrete-time Markov chain with a 4 × 4 probability transition matrix P whose entries depend on two parameters x and y [matrix not legible in the transcription]. Is it possible to choose values for x and y so that the Markov chain has the following properties? In each case, state the values of x and y, or give a brief reason why it is not possible.

(a) The Markov chain has period 2. [2]
(b) The Markov chain is reducible.
(c) The Markov chain has at least one transient state.
(d) The Markov chain has invariant distribution (1/4, 1/4, 1/4, 1/4).

The diagrams below show three Markov chains, where arrows indicate a non-zero transition probability: (A) Markov Chain 1, states 1-3; (B) Markov Chain 2, states 1-4; (C) Markov Chain 3, states 1-2 [diagrams not reproduced]. State whether each of the chains is: irreducible; periodic, giving the period. [3]
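For question 2, taking the garbled first entry of the matrix to be 0.6 (the only value that makes the first row sum to 1), the following Python check (an illustration, not the pen-and-paper argument the question asks for) computes the period of state 1 from the return steps with positive probability, and the stationary distribution as the left eigenvector of P for eigenvalue 1.

# Quick numeric check for question 2(a)-(b), assuming the first row of P is (0.6, 0.3, 0.1).
from functools import reduce
from math import gcd

import numpy as np

P = np.array([[0.6, 0.3, 0.1],
              [0.3, 0.3, 0.4],
              [0.4, 0.1, 0.5]])

# Period of state 1 = gcd of all n with (P^n)[0, 0] > 0.
Pn, return_steps = np.eye(3), []
for n in range(1, 13):
    Pn = Pn @ P
    if Pn[0, 0] > 0:
        return_steps.append(n)
print("period of state 1:", reduce(gcd, return_steps))  # 1, i.e. aperiodic

# Stationary (and here limiting) distribution: left eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
pi = pi / pi.sum()
print("stationary distribution:", pi)

Because every entry of this P is strictly positive, each state reaches every other state in one step, so the chain is irreducible (and, being finite and irreducible, recurrent); the gcd computation returns period 1, so a limiting distribution exists and matches the reported eigenvector. The numeric stationary distribution in particular depends on the assumed first entry 0.6.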
