Questions and Answers of Elementary Probability For Applications
17. (a) Solve recursively in terms of π1, guess the answer, and prove by induction. Alternatively, read part
16. Bishops move diagonally. Starting at a corner, a bishop can reach 32 vertices, of which 14 have degree 7, 10 have degree 9, 6 have degree 11, and 2 have degree 13. Finally, 40.
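The final figure 40 can be checked against the degree counts above, assuming the standard fact that random walk on a graph has invariant distribution proportional to vertex degree, so the mean recurrence time of the corner square (degree 7) is the total degree divided by the corner's degree:

```latex
% Sketch, assuming \pi_v \propto \deg v for random walk on a graph.
\sum_v \deg v = 14\cdot 7 + 10\cdot 9 + 6\cdot 11 + 2\cdot 13 = 280,
\qquad
E_{\text{corner}}(T_{\text{corner}}) = \frac{1}{\pi_{\text{corner}}}
  = \frac{280}{\deg(\text{corner})} = \frac{280}{7} = 40 .
```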
14. Solve the equation π = πP to find π_k = ρ^k π_0, where ρ = p/q. The chain is positive recurrent if and only if p < q. In order to distinguish between transience and null recurrence, consider a
13. The quick way is to look for a solution of the detailed balance equations.
12. 52/1 + 52/2 + · · · + 52/51 ≤ 52(1 + log 51). Use conditional probability and
11. Use the fact that U_n ≥ x if and only if W_{a(n)} ≤ n, where W_k is the time of the kth return to i, and a(n) = (n/μ) + x√(nσ²/μ³). Now, W_{a(n)} is the sum of a(n) independent, identically distributed random variables, and a(n)/n → 1/μ.
10. Check that Q and π are in detailed balance.
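Hints 13 and 10 above both come down to checking detailed balance. A minimal numerical sketch of such a check, using an illustrative three-state generator Q and candidate vector π invented purely for this example (not the chain in the problem), might look like:

```python
import numpy as np

# Illustrative generator Q (rows sum to 0) and candidate invariant vector pi;
# both are made up for this sketch, not taken from the book's problem.
Q = np.array([[-1.0,  1.0,  0.0],
              [ 2.0, -5.0,  3.0],
              [ 0.0,  6.0, -6.0]])
pi = np.array([4.0, 2.0, 1.0])
pi = pi / pi.sum()

# Detailed balance: pi[i] * Q[i, j] == pi[j] * Q[j, i] for every pair i, j.
M = pi[:, None] * Q
print("detailed balance holds:", np.allclose(M, M.T))         # True here
print("pi is invariant (pi Q = 0):", np.allclose(pi @ Q, 0))  # implied by balance
```

The same check works for a discrete-time transition matrix P: detailed balance π_i p_{i,j} = π_j p_{j,i} implies π = πP.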
9. (a) 8, (b) 1, (c) 10. Use Theorem 12.57 and symmetry for the last part.
8. Either slog it out, or consider a collapsed chain with two states, 1 and 0 (representing ‘not 1’). Now use the result of Problem 12.13.2.
7. Assume reversibility. By passing the π term along the product, we may see that π_{i_1} p_{i_1,i_2} p_{i_2,i_3} · · · p_{i_n,i_1} = p_{i_2,i_1} p_{i_3,i_2} · · · p_{i_1,i_n} π_{i_1}. For the converse, sum over the intermediate
6. Either solve the equation π = πP, or argue as follows. If the state is ijk, then i was the last book chosen and, of the books j and k, j was the last chosen. Therefore, π_{ijk} = α_i α_j/(α_j +
5. The Markov property holds by the lack-of-memory property of the geometric distribution (Problem 2.6.5). For the invariant distribution, look for a solution of the detailed balance equations (you
4. The chain is irreducible and P_0(T > n) = b_n, where T is the first return time of 0. Recall Problem 2.6.6. The final answer is π_j = b_j / Σ_i b_i.
3. In ‘physical’ models of this type, one can start by looking for a solution to the detailed balance equations. In this case, we have π_i = C(N, i)² / C(2N, N).
2. Recall Example 12.15. The chain is reversible in equilibrium when 0 < αβ < 1.
1. π_i = 1/N for i ∈ S.
16. Condition on the length of the service time.
22. (c) The mean total is ½ t E(N_t) = ½ λt², by either calculation or symmetry.
Chapter 12
15. p(t) = e^{−λt} cosh λt = q(t). The time to the first change of state has the exponential distribution with parameter λ. Use independence for the last part.
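A one-line check of the formula in hint 15, assuming p(t) is the probability of being in the initial state at time t, i.e. the probability that the Poisson(λt) number of changes of state is even:

```latex
p(t) = \sum_{k \text{ even}} e^{-\lambda t}\,\frac{(\lambda t)^k}{k!}
     = e^{-\lambda t}\,\frac{e^{\lambda t}+e^{-\lambda t}}{2}
     = e^{-\lambda t}\cosh \lambda t .
```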
14. The following argument is not completely rigorous but is illuminating. Let t → ∞ in the formula for G(s, t) given in Problem 11.7.13 to obtain G(s, t) → exp[ρ(s − 1)], where ρ = θ/μ. This … Rewrite this in terms of characteristic functions and appeal to the continuity theorem for a watertight argument.
13. (a) (i) 1 − e^{−λX} is uniformly distributed on (0, 1).
(ii) min{X, Y} has the exponential distribution with parameter 2λ.
(iii) X − Y has the bilateral exponential distribution.
(b) The answer is 0 if a < 1 and a/(1 + a) if a ≥ 1.
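A hedged Monte Carlo sketch of the three claims in part (a), taking X and Y independent exponential(λ) with an illustrative λ = 2:

```python
import numpy as np

rng = np.random.default_rng(0)
lam, n = 2.0, 200_000
X = rng.exponential(1 / lam, n)
Y = rng.exponential(1 / lam, n)

# (i) 1 - exp(-lam*X) should be uniform on (0, 1): mean 1/2, variance 1/12.
U = 1 - np.exp(-lam * X)
print(U.mean(), U.var())                      # approx 0.5 and 0.0833

# (ii) min{X, Y} should be exponential with parameter 2*lam: mean 1/(2*lam).
print(np.minimum(X, Y).mean(), 1 / (2 * lam))

# (iii) X - Y should be bilateral (two-sided) exponential: symmetric about 0,
#       with variance 2/lam**2.
D = X - Y
print(D.mean(), D.var(), 2 / lam**2)
```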
12. Draw the regions in question in the (x, y)-plane. It is useful to prove that R² = X² + Y² and Θ = tan^{−1}(Y/X) are independent, having exponential and uniform distributions, respectively. (a) 1
9. f_Y(y) = ¼(3y + 1)e^{−y} for 0 < y < ∞.
11. Use Theorem 6.62 with g(x, y) = √(x² + y²) and change to polar coordinates. The variance equals σ²(2 − ½π).
6. If you can do Problem 6.9.4 then you should be able to do this one. P(U ≤ x, V ≤ y) = F(y)^n − [F(y) − F(x)]^n for x < y
5. Show that G(y) = P(Y > y) satisfies G(x + y) = G(x)G(y), and solve this equation. The corresponding question for integer-valued random variables appears at Problem 2.6.5.
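A sketch of how the functional equation in hint 5 pins down the distribution, assuming (as is automatic for a survival function) that G is right-continuous, and that 0 < G(1) < 1:

```latex
% Sketch, assuming G right-continuous and 0 < G(1) < 1.
G(x+y)=G(x)\,G(y)
\;\Longrightarrow\; G\!\left(\tfrac{m}{n}\right)=G\!\left(\tfrac{1}{n}\right)^{m}=G(1)^{m/n}
\;\Longrightarrow\; G(y)=G(1)^{y}=e^{-\lambda y},\qquad \lambda=-\log G(1),
```

so Y has the exponential distribution with parameter λ.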
4. min{X, Y} > u if and only if X > u and Y > u.
1. For the first part, find the joint density function of X and XY by the method of change of variables, and then find the marginal density function of XY.
2. No.
3. The region {(x, y, z) : √(4xz) < y
14. Find P(Y ≤ y) for y ∈ R.
13. Let X_k be the position of the kth break (in no special order). The pieces form a polygon if no piece is longer than the sum of the other lengths, which is equivalent to each piece having length less than ½. This fails to occur if and only if the disjoint union A_0 ∪ A_1 ∪ · · · ∪ A_n occurs, where A_0 is the event there is no break in (0, ½], and A_k is the event of no break in (X_k, X_k + ½
12. Assume that the centre is uniform on the rectangle [0, a] × [0, b], and that the acute angle θ between the needle and a line of the first grid is uniform on [0, ½π]. There is no intersection
11. This distance has distribution function (2/π) tan^{−1} x for 0 ≤ x < ∞.
10. f_Y(y) = 3(1 − y)^{−2} exp(−(y + 2)/(1 − y)) for −2 < y < 1.
7. Integrate by parts. You are proving that E(X) = ∫ P(X > x) dx, the continuous version of Problem 2.6.6.
8. Apply the conclusion of Problem 5.8.7 to Y = g(X), express the result as a double integral
6. Note that x ≤ F(y) if and only if F^{−1}(x) ≤ y, whenever 0 < F(y) < 1.
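Hint 6 is the step behind inverse-transform sampling. A minimal sketch, using the exponential distribution as an illustrative F (not the distribution in the exercise):

```python
import numpy as np

rng = np.random.default_rng(1)
lam = 1.5

# If U is uniform on (0, 1) and F^{-1} is the inverse of F, then F^{-1}(U)
# has distribution function F.  For F(x) = 1 - exp(-lam*x) this inverse is
# F^{-1}(u) = -log(1 - u) / lam.
U = rng.uniform(size=100_000)
X = -np.log(1 - U) / lam

print(X.mean(), 1 / lam)     # both approx 1/lam
print(X.var(), 1 / lam**2)   # both approx 1/lam^2
```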
11. This is essentially a reprise of Problem 4.5.8.
3. n ≥ 4.
4. f_Y(y) = √(2/π) exp(−½y²) for y > 0. √(2/π) and 1 − (2/π).
5. Let F^{−1}(y) = sup{x : F(x) = y}. Find P(F(X) ≤ y).
10. P(A wins) = a/(a + b − ab). The mean number of shots is (2 − a)/(a + b − ab).
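A hedged simulation of the two formulas in hint 10, assuming the usual duel setup (my assumption, since the problem statement is not shown here): A and B fire alternately with A first, hitting with probabilities a and b, and the game stops at the first hit:

```python
import random

def duel(a, b, rng):
    """One game: A and B alternate shots, A first; returns (A_won, shots_fired)."""
    shots = 0
    while True:
        shots += 1
        if rng.random() < a:       # A fires
            return True, shots
        shots += 1
        if rng.random() < b:       # B fires
            return False, shots

rng = random.Random(0)
a, b, n = 0.3, 0.4, 200_000
games = [duel(a, b, rng) for _ in range(n)]
p_a = sum(won for won, _ in games) / n
mean_shots = sum(s for _, s in games) / n

print(p_a, a / (a + b - a * b))               # both approx 0.517
print(mean_shots, (2 - a) / (a + b - a * b))  # both approx 2.93
```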
9. This is an alternative derivation of the result of Problem 3.6.12.
8. G_X(s) = G_N(½ + ½s), giving by independence that G = G_N
6. For the third part, find the real part of G_X(θ), where θ is a primitive complex root of unity.
5. The first part of this problem may be done by way of Theorem 4.36, with N + 1 having a … 2)^{n+1} p^{n−r}(2 − p)^{r+1}.
4. [q/(1 − ps)]^N. The variance is Np(1 − p)^{−2}.
3. 9/19, 6/19, 4/19. The mean number of throws is 3.
1. Note that P(X = k) = u_{k−1} − u_k.
2. (1/6)^7 13!/(6! 7!) − 49.
16. (a) The means are 7 and 0, and both variances equal 35/6. To locate the extremal probabilities, find the number of ways in which the various possible outcomes can occur. For example, P(X = x) is maximized at x = 7. To verify (in)dependence, it is
15. var(U_n) = (n − 1)pq − (3n − 5)(pq)².
14. Condition on the value of N. X has the Poisson distribution with parameter λp.
13. c[1 − (1 − c^{−1})^n], by using indicator functions.
12. In calculating the mean, remember that the expectation operator E is linear. The answer here is c(1 + 1/2 + 1/3 + · · · + 1/c), a much more elegant solution than that proposed for Problem
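The answer in hint 12 is the coupon-collector mean c(1 + 1/2 + · · · + 1/c) (assuming, as the form of the answer suggests, c equally likely types per purchase). A quick simulation check with an illustrative c = 10:

```python
import random

def purchases_to_complete(c, rng):
    """Number of uniform purchases needed to see all c coupon types."""
    seen, buys = set(), 0
    while len(seen) < c:
        seen.add(rng.randrange(c))
        buys += 1
    return buys

rng = random.Random(0)
c, trials = 10, 50_000
simulated = sum(purchases_to_complete(c, rng) for _ in range(trials)) / trials
exact = c * sum(1 / k for k in range(1, c + 1))
print(simulated, exact)   # both approx 29.3
```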
11. Adapt the hint for Problem 3.6.10.
10. Let Z_i be the indicator function that the ith box is empty. The total number of empty boxes is S = Z_1 + Z_2 + · · · + Z_M. Also, E(Z_i) = (M − 1)^N / M^N and E(S) = M E(Z_1).
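A sketch verifying the expression in hint 10 by simulation: drop N balls independently and uniformly into M boxes and count the empty ones (the values of M and N below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
M, N, trials = 8, 12, 50_000

# Each row is one experiment: the boxes chosen by the N balls.
boxes = rng.integers(0, M, size=(trials, N))
empty = np.array([M - len(np.unique(row)) for row in boxes])

print(empty.mean(), M * ((M - 1) / M) ** N)   # both approx 1.61
```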
7. Use Theorem 2.42 with B_i = {N = i − 1}.
9. (a) ½, (b) (1/6)(3√5 − 1), (c) 5/6.
6. Let 1_k be the indicator function of the event that, when there are 2k ends, a new hoop is created at the next step. Then E(1_k) = k/C(2k, 2) = 1/(2k − 1). The mean final number of hoops is Σ^n
5. P(U > k) = P(X > k)P(Y > k).
4. P(U_n = k) = P(U_n ≥ k) − P(U_n ≥ k + 1), and P(U_n ≥ k) = (1 − (k − 1)/N)^n.
1. Use the result of Exercise 1.35, with Theorem 3.27.
2. a = b = ½. No.
9. (1 − p^n)/[p^n(1 − p
8. This is sometimes called Banach’s matchbox problem. First, condition on which pocket is first emptied. You may find the hint more comprehensible if you note that 2(n − h)p_h = (2n − h)p_{h+1}.
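The identity quoted in hint 8 can be checked directly, assuming the standard matchbox probabilities p_h = C(2n − h, n) 2^{−(2n−h)} for the number h of matches remaining in the other box (this formula is an assumption of the sketch, not quoted from the book):

```python
from math import comb

def p(h, n):
    # Assumed Banach matchbox pmf: P(h matches remain) = C(2n-h, n) * 2**-(2n-h)
    return comb(2 * n - h, n) * 0.5 ** (2 * n - h)

n = 6
for h in range(n):
    lhs = 2 * (n - h) * p(h, n)
    rhs = (2 * n - h) * p(h + 1, n)
    assert abs(lhs - rhs) < 1e-12
print("2(n - h) p_h = (2n - h) p_{h+1} holds for h = 0, ..., n - 1")
```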
7. This generalizes the result of Problem 2.6.6.
6. The summation here is Σ_{k=0}^∞ Σ_{i=k+1}^∞ P(X = i). Change the order of summation. For the second part, use the result of Exercise 1.20.
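The interchange of summation that hint 6 asks for, written out for a non-negative integer-valued X:

```latex
\sum_{k=0}^{\infty}\sum_{i=k+1}^{\infty} P(X=i)
  \;=\; \sum_{i=1}^{\infty}\sum_{k=0}^{i-1} P(X=i)
  \;=\; \sum_{i=1}^{\infty} i\,P(X=i)
  \;=\; E(X).
```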
5. For the last part, show that G(n) = P(X > n) satisfies G(m + n) = G(m)G(n), and solve this relation.
4. α < −1 and c = 1/ζ(−α), where ζ(p) = Σ_k k^{−p} is the Riemann zeta function.
2. Use Theorem 2.42 with X and B_i chosen appropriately. The answer is m(r) = r/p.
3. E(X²) = Σ_x x² P(X = x), the sum of non-negative terms.
19. Show n = 6.
16. Conditional probabilities again. The answer is ¼(2e^{−1} + e^{−2} + e^{−4}).
18. ∪_{i=1}^n A_i → ∪_{i=1}^∞ A_i as n → ∞.
15. Use the result of Problem 1.11.14(a).
14. (a) Induction. (b) Let A_i be the event that the ith key is hung on its own hook.
13. … Use the Partition Theorem 1.48 to obtain the difference equations. Either iterate these directly to solve them, or set up a matrix recurrence relation, and iterate this.
12. To do this rigorously is quite complicated. You need to show that the proportion 1/10 is correct for any single one of the numbers 0, 1, 2, . . . , 9.
10. 1 − (1 − p)(1 − p²)² and 1 − (1 − p)(1 − p²)² − p + p[1 − (1 − p)²]².
9. If X and Y are the numbers of heads obtained, P(X = Y) = Σ_k P(X = k)P(Y = k) = Σ_k P(X = k)P(Y = n − k) = P(X + Y = n).
1. Expand (1 + x)^n + (1 − x)^n.
2. No.
6. 79/140 and 40/61.
7. 11/50.
8. √(3/(4πn)) (27/32)^n.
18. Let (X_n : n ≥ 0) be a simple, symmetric random walk on the integers {. . . , −1, 0, 1, . . . }, with X_0 = 0 and P(X_{n+1} = i ± 1 | X_n = i) = ½. For each integer a ≥ 1, let T_a = inf{n ≥ 0 : X_n = a}. Show that T_a is a stopping time. Define a random variable Y_n by the rule Y_n = X_n if n < T_a, and Y_n = 2a − X_n if n ≥ T_a. Show that (Y_n : n ≥ 0) is also a simple, symmetric random walk. Let M_n = max{X_i : 0 ≤ i ≤ n}. Explain why {M_n ≥ a} = {T_a ≤ n} for a ≥ 1. By using the process (Y_n : n ≥ 0) constructed
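A simulation sketch of where this construction leads. The identity P(M_n ≥ a) = P(X_n ≥ a) + P(X_n > a) used below is the standard reflection-principle conclusion (the statement of the problem is cut off above, so this target is an assumption of the sketch):

```python
import numpy as np

rng = np.random.default_rng(3)
n, a, trials = 20, 3, 200_000

# Simple symmetric random walk: steps of +/-1 with probability 1/2 each.
steps = rng.choice([-1, 1], size=(trials, n))
paths = np.cumsum(steps, axis=1)
Xn = paths[:, -1]
Mn = np.maximum(paths.max(axis=1), 0)       # include X_0 = 0 in the maximum

lhs = np.mean(Mn >= a)
rhs = np.mean(Xn >= a) + np.mean(Xn > a)    # reflection-principle prediction
print(lhs, rhs)                             # the two estimates agree closely
```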
17. A frog inhabits a pond with an infinite number of lily pads, numbered 1, 2, 3, . . . . She hops from pad to pad in the following manner: if she happens to be on pad i at a given time, she hops to
(a) Find the equilibrium distribution of the corresponding Markov chain.
(b) Suppose the frog starts on pad k and stops when she returns to it. Show that the expected number of times the frog hops is e(k − 1)!, where e = 2.718 . . . . What is the expected number of
16. An erratic bishop starts at the bottom left of a chess board and performs random moves. At each stage, she picks one of the available legal moves with equal probability, independently of earlier moves. Let X_n be her position after n moves. Show that (X_n : n ≥ 0) is a reversible Markov chain, and find its invariant distribution. What is the mean number of moves
Show that the H_k satisfy a second-order difference equation, and hence find H_k. (Cambridge 2009)
Deduce the invariant distribution. (Oxford 2005) *14. Consider a Markov chain with state space S = {0, 1, 2, . . . } and transition matrix given by p_{i,j} = qp^{j−i+1} for i ≥ 1 and j ≥ i − 1,