
Question:

• 1. A player throws a fair die and simultaneously flips a fair coin. If the coin lands heads, then she wins twice, and if tails, then one-half of the value that appears on the die. Determine her expected winnings.
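The setup of Problem 1 can be checked with a quick Monte Carlo sketch (the function name and trial count are mine, not the text's); by independence the answer should equal E[multiplier] · E[die] = 1.25 · 3.5.

```python
import random

def simulate_winnings(trials=200_000, seed=1):
    """Monte Carlo sketch of Problem 1: winnings are twice the die value
    on heads, half the die value on tails."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        die = rng.randint(1, 6)
        heads = rng.random() < 0.5
        total += 2 * die if heads else 0.5 * die
    return total / trials

# By independence, E[winnings] = E[multiplier] * E[die] = 1.25 * 3.5 = 4.375
```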
• 2. The game of Clue involves 6 suspects, 6 weapons, and 9 rooms. One of each is randomly chosen and the object of the game is to guess the chosen three.

(a) How many solutions are possible?
In one version of the game, after the selection is made each of the players is then randomly given three of the remaining cards. Let S, W, and R be, respectively, the numbers of suspects, weapons, and rooms in the set of three cards given to a specified player. Also, let X denote the number of solutions that are possible after that player observes his or her three cards.

(b) Express X in terms of S, W, and R.

(c) Find E[X].

• 3. If X and Y are independent uniform (0, 1) random variables, show that $$E[|X - Y|^a] = \frac{2}{(a + 1)(a + 2)}$$ for a > 0.
• 4. Let X and Y be independent random variables, both being equally likely to be any of the values 1, 2, ..., m. Show that $$E[|X - Y|] = \frac{(m - 1)(m + 1)}{3m}$$
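The discrete identity of Problem 4 can be verified exactly by enumerating all m² equally likely pairs (a sketch; the function name is mine):

```python
from fractions import Fraction

def expected_abs_diff(m):
    """Exact E[|X - Y|] for X, Y independent uniform on {1, ..., m}."""
    total = sum(abs(i - j) for i in range(1, m + 1) for j in range(1, m + 1))
    return Fraction(total, m * m)

# Check the claimed closed form (m - 1)(m + 1)/(3m) for a few values of m
for m in (2, 5, 10):
    assert expected_abs_diff(m) == Fraction((m - 1) * (m + 1), 3 * m)
```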
• 5. The county hospital is located at the center of a square whose sides are 3 miles wide. If an accident occurs within this square, then the hospital sends out an ambulance. The road network is rectangular, so the travel distance from the hospital, whose coordinates are (0, 0), to the point (x, y) is |x| + |y|.
If an accident occurs at a point that is uniformly distributed in the square, find the expected travel distance of the ambulance.
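A Monte Carlo sketch of Problem 5 (parameter names are my own); by linearity the answer should be E[|X|] + E[|Y|] = 0.75 + 0.75 = 1.5 miles.

```python
import random

def expected_travel(trials=200_000, seed=2, half_side=1.5):
    """Monte Carlo sketch of Problem 5: accident point uniform on the
    square [-1.5, 1.5]^2, travel distance |x| + |y|."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        x = rng.uniform(-half_side, half_side)
        y = rng.uniform(-half_side, half_side)
        total += abs(x) + abs(y)
    return total / trials
```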
• 6. A fair die is rolled 10 times. Calculate the expected sum of the 10 rolls.
• 7. Suppose that A and B each randomly, and independently, choose 3 of 10 objects. Find the expected number of objects

(a) chosen by both A and B;

(b) not chosen by either A or B;

(c) chosen by exactly one of A and B.
• 8. N people arrive separately to a professional dinner. Upon arrival, each person looks to see if he or she has any friends among those present. That person then either sits at the table of a friend or at an unoccupied table if none of those present is a friend. Assuming that each of the $\binom{N}{2}$ pairs of people are, independently, friends with probability p, find the expected number of occupied tables.
HINT: Let Xi equal 1 or 0 depending on whether the ith arrival sits at a previously unoccupied table.
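Following the hint, arrival i starts a new table exactly when it is friends with none of the i − 1 people already present, which happens with probability (1 − p)^(i−1). A simulation sketch (N = 10 and p = 0.3 are my illustrative choices):

```python
import random

def simulate_tables(N=10, p=0.3, trials=100_000, seed=3):
    """Monte Carlo sketch of Problem 8: count arrivals that sit at a
    previously unoccupied table."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        for i in range(N):
            # friends with each earlier arrival independently w.p. p
            if all(rng.random() >= p for _ in range(i)):
                total += 1  # no friends present: opens a new table
    return total / trials

# The indicator argument gives E = sum_{i=1}^{N} (1 - p)^(i - 1)
exact = sum((1 - 0.3) ** (i - 1) for i in range(1, 11))
```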
• 9. A total of n balls, numbered 1 through n, are put into n urns, also numbered 1 through n in such a way that ball i is equally likely to go into any of the urns 1, 2,..., i. Find

(a) the expected number of urns that are empty;

(b) the probability that none of the urns is empty.
• 10. Consider 3 trials, each having the same probability of success. Let X denote the total number of successes in these trials. If E[X] = 1.8, what is

(a) the largest possible value of P(X = 3);

(b) the smallest possible value of P(X = 3)?
In both cases construct a probability scenario that results in P(X = 3) having the stated value.
HINT: For part

(b) you might start by letting U be a uniform random variable on (0, 1) and then defining the trials in terms of the value of U.

• 11. Consider n independent flips of a coin having probability p of landing heads.
Say that a changeover occurs whenever an outcome differs from the one preceding it. For instance, if  n = 5 and the outcome is HHTHT, then there is a total of 3 changeovers. Find the expected number of changeovers.
HINT: Express the number of changeovers as the sum of n - 1 Bernoulli random variables.
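Per the hint, each of the n − 1 adjacent pairs of flips differs with probability 2p(1 − p), so the expected number of changeovers is (n − 1) · 2p(1 − p). For small n this can be checked by exact enumeration (a sketch; names are mine):

```python
from itertools import product

def expected_changeovers(n, p):
    """Exact E[# changeovers] by enumerating all 2^n outcome sequences."""
    total = 0.0
    for seq in product("HT", repeat=n):
        prob = 1.0
        for c in seq:
            prob *= p if c == "H" else 1 - p
        changeovers = sum(a != b for a, b in zip(seq, seq[1:]))
        total += prob * changeovers
    return total

# Each adjacent pair differs with probability 2p(1-p)
assert abs(expected_changeovers(5, 0.3) - 4 * 2 * 0.3 * 0.7) < 1e-12
```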
• 12. A group of n men and m women are lined up at random. Determine the expected number of men that have a woman on at least one side of them.
HINT: Define an indicator random variable for each man.
• 13. Repeat Problem 12 when the group is seated at a round table.
• 14. An urn has m black balls. At each stage a black ball is removed and a new ball, that is black with probability p and white with probability 1 - p, is put in its place. Find the expected number of stages needed until there are no more black balls in the urn.
NOTE: The above has possible applications to understanding the AIDS disease. Part of the body's immune system consists of a ...

• 1,..., r. Compute the expected number of balls that are withdrawn before ball number 1.
• 21. For a group of 100 people compute

(a) the expected number of days of the year that are birthdays of exactly 3 people;

(b) the expected number of distinct birthdays.
• 22. How many times would you expect to roll a fair die before all 6 sides appeared at least once?
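Problem 22 is the coupon-collector problem: summing the geometric waiting times gives 6/6 + 6/5 + ... + 6/1 = 14.7 rolls. A simulation sketch (names are mine):

```python
import random

def rolls_until_all_sides(trials=100_000, seed=4):
    """Monte Carlo sketch of Problem 22: average rolls until every
    side of a fair die has appeared at least once."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        seen = set()
        while len(seen) < 6:
            seen.add(rng.randint(1, 6))
            total += 1
    return total / trials

# Sum of geometric means: 6/k for k = 6, 5, ..., 1 sides still missing
exact = sum(6 / k for k in range(1, 7))  # = 14.7
```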
• 23. Urn 1 contains 5 white and 6 black balls, while urn 2 contains 8 white and 10 black balls. Two balls are randomly selected from urn 1 and are then put in urn 2. If 3 balls are then randomly selected from urn 2, compute the expected number of white balls in the trio.
HINT: Let X_i equal 1 if the ith white ball initially in urn 1 is one of the three selected, and let X_i equal 0 otherwise. Similarly, let Y_i equal 1 if the ith white ball initially in urn 2 is one of the three selected, and let Y_i equal 0 otherwise. The number of white balls in the trio can now be written as $$\sum_{i=1}^{5} X_i + \sum_{i=1}^{8} Y_i$$
• 24. A bottle initially contains m large pills and n small pills. Each day a patient randomly chooses one of the pills. If a small pill is chosen, then that pill is eaten. If a large pill is chosen, then the pill is broken in two; one part is returned to the bottle (and is now considered a small pill) and the other part is then eaten.

(a) Let X denote the number of small pills in the bottle after the last large pill has been chosen and its smaller half returned. Find E[X].
HINT: Define n + m indicator variables, one for each of the n small pills initially present and one for each of the m small pills created when a large one is split in two. Now use the argument of Example 2m.

(b) Let Y denote the day on which the last large pill is chosen. Find E[Y].
HINT: What is the relationship between X and Y?
• 25. Let X1, X2, ... be a sequence of independent and identically distributed continuous random variables. Let N ≥ 2 be such that $$X_1 \geq X_2 \geq \cdots \geq X_{N-1} < X_N$$ That is, N is the point at which the sequence stops decreasing. Show that E[N] = e.
HINT: First find P{N ≥ n}.
• 26. If X1, X2, ..., Xn are independent and identically distributed random variables having uniform distributions over (0, 1), find

(a) E[max(X1, ..., Xn)];

(b) E[min(X1, ..., Xn)].
• 27. In Problem 6, calculate the variance of the sum of the rolls.
• 28. In Problem 9, compute the variance of the number of empty urns.
• 29. If E[X] = 1 and Var(X) = 5, find

(a) E[(2 + X)²];

(b) Var(4 + 3X).
• 30. If 10 married couples are randomly seated at a round table, compute

(a) the expected number and

(b) the variance of the number of wives who are seated next to their husbands.
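For part (a) of Problem 30, an indicator argument gives each wife a 2/19 chance of being seated next to her husband, so the expected number is 20/19. A simulation sketch (the couple-numbering convention is mine):

```python
import random

def simulate_adjacent_couples(trials=100_000, seed=5):
    """Monte Carlo sketch of Problem 30(a): persons 2i and 2i+1 form
    couple i; seat all 20 uniformly around a circle and count couples
    seated together."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        seats = list(range(20))
        rng.shuffle(seats)
        for pos in range(20):
            a, b = seats[pos], seats[(pos + 1) % 20]
            if a // 2 == b // 2:  # adjacent seats hold the same couple
                total += 1
    return total / trials

# Indicator argument: each of the 10 wives is next to her husband w.p. 2/19
```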
• 31. Cards from an ordinary deck are turned face up one at a time. Compute the expected number of cards that need be turned face up in order to obtain

(a) 2 aces;

(b) 5 spades;

(c) all 13 hearts.
• 32. Let X be the number of 1's and Y the number of 2's that occur in n rolls of a fair die. Compute Cov(X, Y).
• 33. A die is rolled twice. Let X equal the sum of the outcomes, and let Y equal the first outcome minus the second. Compute Cov(X, Y).
• 34. The random variables X and Y have a joint density function given by $$f(x,y)=\begin{cases}
\frac{2e^{-2x}}{x} & 0\leq x<\infty,\ 0\leq y\leq x\\
0 & \text{otherwise} \end{cases}$$
Compute Cov(X, Y).
• 35. Let X1, X2, ... be independent with common mean µ and common variance σ², and set Y_n = X_n + X_{n+1} + X_{n+2}. For j ≥ 0, find Cov(Y_n, Y_{n+j}).
• 36. The joint density function of X and Y is given by $$f(x,y)=\frac{1}{y}e^{-(y+x/y)}, \quad x>0,\ y>0$$
Find E[X], E[Y], and show that Cov(X, Y) = 1.
• 37. A pond contains 100 fish, of which 30 are carp. If 20 fish are caught, what are the mean and variance of the number of carp among these 20? What assumptions are you making?
• 38. A group of 20 people, consisting of 10 men and 10 women, are randomly arranged into 10 pairs of 2 each. Compute the expectation and variance of the number of pairs that consist of a man and a woman. Now suppose the 20 people consist of 10 married couples. Compute the mean and variance of the number of married couples that are paired together.
• 39. Let X1, X2, ..., Xn be independent random variables having an unknown continuous distribution function F, and let Y1, Y2, ..., Ym be independent random variables having an unknown continuous distribution function G.
Now order those n + m variables, and let $$I_i= \begin{cases}
1 & \text{if the ith smallest of the } n+m \text{ variables is from the } X \text{ sample}\\
0 & \text{otherwise}
\end{cases}$$

The random variable $$R = \sum_{i=1}^{n+m} i I_i$$
is the sum of the ranks of the X sample and is the basis of a standard statistical procedure (called the Wilcoxon sum of ranks test) for testing whether F and G are identical distributions. This test accepts the hypothesis that F = G when R is neither too large nor too small. Assuming that the hypothesis of equality is in fact correct, compute the mean and variance of R.
HINT: Use the results of Example 3d.
• 40. There are two distinct methods for manufacturing certain goods, the quality of goods produced by method i being a continuous random variable having distribution F_i, i = 1, 2. Suppose that n goods are produced by method 1 and m by method 2. Rank the n + m goods according to quality, and let $$X_j = \begin{cases}
1 & \text{if the } j\text{th best was produced from method 1}\\
2 & \text{otherwise}
\end{cases}$$ For the vector X1, X2, ..., X_{n+m}, which consists of n 1's and m 2's, let R denote the number of runs of 1. For instance, if n = 5, m = 2, and X = 1, 2, 1, 1, 1, 1, 2, then R = 2. If F1 = F2 (that is, if the two methods produce identically distributed goods), what are the mean and variance of R?
• 41. If X1, X2, X3, X4 are (pairwise) uncorrelated random variables, each having mean 0 and variance 1, compute the correlations of

(a) X₁ + X₂ and X2 + X3;

(b) X₁ + X₂ and X3 + X4.
• 42. Consider the following dice game, as played at a certain gambling casino: Players 1 and 2 roll a pair of dice in turn. The bank then rolls the dice to determine the outcome according to the following: player i, i = 1, 2, wins if his roll is strictly greater than the bank's. For i = 1, 2, let $$I_i = \begin{cases}
1 & \text{if } i \text{ wins}\\
0 & \text{otherwise}
\end{cases}$$ and show that I1 and I2 are positively correlated. Explain why this result was to be expected.
• 43. Consider a graph having n vertices labeled 1, 2, ..., n, and suppose that between each of the $\binom{n}{2}$ pairs of distinct vertices an edge is, independently, present with probability p. The degree of vertex i, designated as D_i, is the number of edges that have vertex i as one of their vertices.

(a) What is the distribution of D₁?

(b) Find ρ(D_i, D_j), the correlation between D_i and D_j.
• 44. A fair die is successively rolled. Let X and Y denote, respectively, the number of rolls necessary to obtain a 6 and a 5. Find

(a) E[X];

(b) E[X | Y = 1];

(c) E[X | Y = 5].
• 45. An urn contains 4 white and 6 black balls. Two successive random samples of sizes 3 and 5, respectively, are drawn from the urn without replacement.
Let X and Y denote the number of white balls in the two samples, and compute E[X | Y = i] for i = 1, 2, 3, 4.
• 46. The joint density of X and Y is given by $$f(x, y) = \frac{e^{-x/y}e^{-y}}{y}, \quad 0 < x < \infty,\ 0 < y < \infty$$
Compute E[X² | Y = y].
• 47. The joint density of X and Y is given by $$f(x, y) = \frac{e^{-y}}{y}, \quad 0 < x < y,\ 0 < y < \infty$$
Compute E[X³ | Y = y].
• 48. A population is made up of r disjoint subgroups. Let p_i denote the proportion of the population that is in subgroup i, i = 1, ..., r. If the average weight of the members of subgroup i is w_i, i = 1, ..., r, what is the average weight of the members of the population?
• 49. A prisoner is trapped in a cell containing 3 doors. The first door leads to a tunnel that returns him to his cell after 2 days of travel. The second leads to a tunnel that returns him to his cell after 4 days of travel. The third door leads to freedom after 1 day of travel. If it is assumed that the prisoner will always select doors 1, 2, and 3 with respective probabilities .5, .3, and .2, what is the expected number of days until the prisoner reaches freedom?
• 50. Consider the following dice game. A pair of dice are rolled. If the sum is 7, then the game ends and you win 0. If the sum is not 7, then you have the option of either stopping the game and receiving an amount equal to that sum or starting over again. For each value of i, i = 2, ..., 12, find your expected return if you employ the strategy of stopping the first time that a value at least as large as i appears. What value of i leads to the largest expected return?
HINT: Let X_i denote the return when you use the critical value i. To compute E[X_i], condition on the initial sum.
• 51. Ten hunters are waiting for ducks to fly by. When a flock of ducks flies overhead, the hunters fire at the same time, but each chooses his target at random, independently of the others. If each hunter independently hits his target with probability .6, compute the expected number of ducks that are hit. Assume that the number of ducks in a flock is a Poisson random variable with mean 6.
• 52. The number of people that enter an elevator on the ground floor is a Poisson random variable with mean 10. If there are N floors above the ground floor and if each person is equally likely to get off at any one of these N floors, independently of where the others get off, compute the expected number of stops that the elevator will make before discharging all of its passengers.
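A simulation sketch of Problem 52 (the choice N = 6 is mine). By the splitting property of the Poisson, each floor is skipped with probability e^(−10/N), which suggests E[stops] = N(1 − e^(−10/N)) as a check; the Poisson sampler below uses Knuth's multiplication method since the standard library has none.

```python
import math
import random

def poisson_sample(rng, lam):
    """Knuth's multiplication method for a Poisson(lam) sample."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def simulate_stops(N=6, trials=100_000, seed=6):
    """Monte Carlo sketch of Problem 52: Poisson(10) riders, each picks
    one of N floors uniformly; count distinct floors chosen."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        people = poisson_sample(rng, 10)
        total += len({rng.randrange(N) for _ in range(people)})
    return total / trials

# Each floor's count is Poisson(10/N), so it is skipped w.p. exp(-10/N)
exact = 6 * (1 - math.exp(-10 / 6))
```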

• 53. Suppose that the expected number of accidents per week at an industrial plant is 5. Suppose also that the numbers of workers injured in each accident are independent random variables with a common mean of 2.5. If the number of workers injured in each accident is independent of the number of accidents that occur, compute the expected number of workers injured in a week.
• 54. A coin having probability *p* of coming up heads is continually flipped until both heads and tails have appeared. Find

(a) the expected number of flips;

(b) the probability that the last flip lands heads.
• 55. A person continually flips a coin until a run of 3 consecutive heads appears.
Assuming that each flip independently lands heads with probability *p*, determine the expected number of flips required.
HINT: Let *T* denote the first flip that lands on tails and let it be 0 if all flips land on heads, and then condition on *T*.
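A simulation sketch of Problem 55 (p = 0.5 is my illustrative choice). Conditioning as in the hint leads to E = 1/p + 1/p² + 1/p³, which is 14 for a fair coin, and the simulation can be checked against that value.

```python
import random

def flips_until_three_heads(p=0.5, trials=100_000, seed=7):
    """Monte Carlo sketch of Problem 55: flips until 3 heads in a row."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        run = 0
        while run < 3:
            total += 1
            run = run + 1 if rng.random() < p else 0
    return total / trials

# Conditioning on the first tail gives E = 1/p + 1/p^2 + 1/p^3 = 14 for p = 1/2
```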
• 56. There are *n* + 1 participants in a game. Each person, independently, is a winner with probability *p*. The winners share a total prize of 1 unit. (For instance, if 4 people win, then each of them receives $\frac{1}{4}$, whereas if there are no winners, then none of the participants receive anything.) Let *A* denote a specified one of the players, and let *X* denote the amount that is received by *A*.

(a) Compute the expected total prize shared by the players.

(b) Argue that $E[X] = \frac{1 - (1-p)^{n+1}}{n+1}$.

(c) Compute *E[X]* by conditioning on whether *A* is a winner, and conclude that $$E[(1 + B)^{-1}] = \frac{1 - (1-p)^{n+1}}{(n+1)p}$$
when *B* is a binomial random variable with parameters *n* and *p*.
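The identity in part (c) of Problem 56 can be verified numerically by summing 1/(k + 1) against the binomial pmf (a sketch; the function name is mine):

```python
from math import comb

def expected_inverse(n, p):
    """Exact E[1/(1+B)] for B ~ Binomial(n, p), by summing over the pmf."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) / (k + 1)
               for k in range(n + 1))

# Compare against the closed form (1 - (1-p)^(n+1)) / ((n+1) p)
for n, p in [(5, 0.3), (10, 0.5), (7, 0.9)]:
    closed_form = (1 - (1 - p) ** (n + 1)) / ((n + 1) * p)
    assert abs(expected_inverse(n, p) - closed_form) < 1e-12
```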
• 57. Each of *m* + 2 players pays 1 unit to a kitty in order to play the following game. A fair coin is to be flipped successively *n* times, where *n* is an odd number, and the successive outcomes noted. Each player writes down, before the flips, a prediction of the outcomes. For instance, if *n* = 3, then a player might write down (H, H, T), which means that he or she predicts that the first flip will land heads, the second heads, and the third tails. After the coins are flipped, the players count their total number of correct predictions. Thus, if the actual outcomes are all heads, then the player who wrote (H, H, T)
would have 2 correct predictions. The total kitty of *m* + 2 is then evenly split up among those players having the largest number of correct predictions.
Since each of the coin flips is equally likely to land on either heads or tails, *m* of the players have decided to make their predictions in a totally random fashion. Specifically, they will each flip one of their own fair coins *n*
times and then use the result as their prediction. However, the final 2 of the players have formed a syndicate and will use the following strategy. One of them will make predictions in the same random fashion as the other *m*
players, but the other one will then predict exactly the opposite of the first.
That is, when the randomizing member of the syndicate predicts an *H*, the other member predicts a *T*. For instance, if the randomizing member of the syndicate predicts (H, H, T), then the other one predicts (T, T, H).

(a) Argue that exactly one of the syndicate members will have more than *n*/2 correct predictions. (Remember, *n* is odd.)

(b) Let *X* denote the number of the nonsyndicate players that have more than *n*/2 correct predictions. What is the distribution of *X*?

(c) With *X* as defined in part (b), argue that $$E[\text{payoff to the syndicate}] = (m+2)E[\frac{1}{X+1}]$$

(d) Use part

(c) of Problem 56 to conclude that $$E[\text{payoff to the syndicate}] = \frac{2(m+2)}{m+1}[1-(\frac{1}{2})^{n+1}]$$
and explicitly compute this when *m* = 1, 2, and 3.
As it can be shown that $$\frac{2(m+2)}{m+1}[1-(\frac{1}{2})^{n+1}] > 2$$
it follows that the syndicate's strategy always gives it a positive expected profit.
• 58. Let *U*1, *U*2,... be a sequence of independent uniform (0, 1) random variables.
In Example 4h we showed that for 0 ≤ *x* ≤ 1, *E*[*N*(*x*)] = *e*^*x*, where $$N(x) = \min\{n: \sum_{i=1}^{n}U_{i} > x\}$$
This problem gives another approach to establishing this result.

(a) Show by induction on *n* that for 0 < *x* ≤ 1 and all *n* ≥ 0, $$P\{N(x) \geq n + 1\} = \frac{x^{n}}{n!}$$
HINT: First condition on *U*1 and then use the induction hypothesis.

(b) Use part

(a) to conclude that $$E[N(x)] = e^{x}$$
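The claim E[N(x)] = e^x in Problem 58 can be checked directly by simulation (a sketch; names are mine):

```python
import math
import random

def simulate_N(x, trials=200_000, seed=8):
    """Monte Carlo sketch of Problem 58: N(x) is the least n with
    U_1 + ... + U_n > x for i.i.d. uniform (0, 1) variables."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        s, n = 0.0, 0
        while s <= x:
            s += rng.random()
            n += 1
        total += n
    return total / trials

# For x = 1 the average should be close to e = 2.71828...
approx = simulate_N(1.0)
```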
• 59. An urn contains 30 balls, of which 10 are red and 8 are blue. From this urn, 12 balls are randomly withdrawn. Let *X* denote the number of red, and *Y* the number of blue, balls that are withdrawn. Find Cov(*X*, *Y*)

(a) by defining appropriate indicator (that is, Bernoulli) random variables *Xi*, *Yi* such that $X = \sum_{i=1}^{10}X_{i}$, $Y = \sum_{i=1}^{8}Y_{i}$

(b) by conditioning (on either *X* or *Y*) to determine *E[XY]*.

• 60. Type i light bulbs function for a random amount of time having mean µ_i and standard deviation σ_i, i = 1, 2. A light bulb randomly chosen from a bin of bulbs is a type 1 bulb with probability p, and a type 2 bulb with probability 1 − p. Let X denote the lifetime of this bulb. Find

(a) E[X]

(b) Var(X).
• 61. In Example 4c compute the variance of the length of time until the miner reaches safety.
• 62. The dice game of craps was defined in Problem 26 of Chapter 2. Compute

(a) the mean and

(b) the variance of the number of rolls of the dice that it takes to complete one game of craps.
• 63. Consider a gambler who at each gamble either wins or loses her bet with probabilities p and 1 − p. When p > 1/2, a popular gambling system, known as the Kelley strategy, is to always bet the fraction 2p − 1 of your current fortune. Compute the expected fortune after n gambles of a gambler who starts with x units and employs the Kelley strategy.
• 64. The number of accidents that a person has in a given year is a Poisson random variable with mean λ. However, suppose that the value of λ changes from person to person, being equal to 2 for 60 percent of the population and 3 for the other 40 percent. If a person is chosen at random, what is the probability that he will have

(a) 0 accidents and

(b) exactly 3 accidents in a year? What is the conditional probability that he will have 3 accidents in a given year, given that he had no accidents the preceding year?
• 65. Repeat Problem 64 when the proportion of the population having a value of λ less than x is equal to 1 − e^{−x}.
• 66. Consider an urn containing a large number of coins, and suppose that each of the coins has some probability p of turning up heads when it is flipped.
However, this value of p varies from coin to coin. Suppose that the composition of the urn is such that if a coin is selected at random from the urn, then its p-value can be regarded as being the value of a random variable that is uniformly distributed over [0, 1]. If a coin is selected at random from the urn and flipped twice, compute the probability that

(a) the first flip is a head;

(b) both flips are heads.
• 67. In Problem 66, suppose that the coin is tossed n times. Let X denote the number of heads that occur. Show that $$P\{X = i\} = \frac{1}{n+1}, \qquad i = 0, 1, \ldots, n$$
HINT: Make use of the fact that $$\int_0^1 x^{a-1}(1-x)^{b-1}\, dx = \frac{(a-1)!(b-1)!}{(a+b-1)!}$$
when a and b are positive integers.
• 68. Suppose that in Problem 66 we continue to flip the coin until a head appears.
Let N denote the number of flips needed. Find

(a) P{N = i}, i ≥ 1;

(b) P{N ≥ i};

(c) E[N].
• 69. In Example 5b let S denote the signal sent and R the signal received.

(a) Compute E[R].

(b) Compute Var(R).

(c) Is R normally distributed?

(d) Compute Cov(R, S).
• 70. In Example 5c, suppose that X is uniformly distributed over (0, 1). If the discretized regions are determined by a₀ = 0, a₁ = 1/2, and a₂ = 1, determine the optimal quantizer Y and compute E[(X − Y)²].
• 71. The moment generating function of X is given by $$M_X(t) = \exp\{2e^{t} - 2\}$$ and that of Y by $$M_Y(t) = \left(\tfrac{3}{4}e^{t} + \tfrac{1}{4}\right)^{10}$$ If X and Y are independent, what are

(a) P(X + Y = 2);

(b) P(XY = 0);

(c) E[XY]?
• 72. Let X be the value of the first die and Y the sum of the values when two dice are rolled. Compute the joint moment generating function of X and Y.
• 73. The joint density of X and Y is given by $$f(x, y) = \frac{1}{\sqrt{2\pi}}e^{-y}e^{-(x-y)^2/2}, \quad 0 < y < \infty,\ -\infty < x < \infty$$

(a) Compute the joint moment generating function of X and Y.

(b) Compute the individual moment generating functions.
• 74. Two envelopes, each containing a check, are placed in front of you. You are to choose one of the envelopes, open it, and see the amount of the check. At this point you can either accept that amount or you can exchange it for the check in the unopened envelope. What should you do? Is it possible to devise a strategy that does better than just accepting the first envelope?
Let A and B, A < B, denote the (unknown) amounts of the checks, and note that the strategy that randomly selects an envelope and always accepts its check has an expected return of (A + B)/2. Consider the following strategy:
Let F(-) be any strictly increasing (that is, continuous) distribution function.
Randomly choose an envelope and open it. If the discovered check has value x then accept it with probability F(x), and with probability 1 - F(x)
exchange it.

(a) Show that if you employ the latter strategy, then your expected return is greater than (A + B)/2.
HINT: Condition on whether the first envelope has value A or B.
Now consider the strategy that fixes a value x, and then accepts the first check if its value is greater than x and exchanges it otherwise.

(b) Show that for any x, the expected return under the x-strategy is always at least (A + B)/2, and that it is strictly larger than (A + B)/2 if x lies between A and B.

(c) Let X be a continuous random variable on the whole line, and consider the following strategy: Generate the value of X, and if X = x, then employ the x-strategy of part (b). Show that the expected return under this strategy is greater than (A + B)/2.
THEORETICAL EXERCISES

• 1. Show that E[(X − a)²] is minimized at a = E[X].
• 2. Suppose that X is a continuous random variable with density function

f. Show that E[|X − a|] is minimized when a is equal to the median of F.
HINT: Write $$E[|X-a|] = \int |x-a|\, f(x)\, dx$$
Now break up the integral into the regions where x < a and where x > a, and differentiate.
• 3. Prove Proposition 2.1 when

(a) X and Y have a joint probability mass function;

(b) X and Y have a joint probability density function and g(x, y) ≥ 0 for all x, y.
• 4. Let X be a random variable having finite expectation µ and variance σ², and let g(·) be a twice differentiable function. Show that $$E[g(X)] \approx g(\mu) + \frac{g''(\mu)}{2}\sigma^2$$
HINT: Expand g(-) in a Taylor series about µ. Use the first three terms and ignore the remainder.
• 5. Let A₁, A₂, ..., Aₙ be arbitrary events, and define C_k = {at least k of the A_i occur}. Show that $$\sum_{k=1}^{n} P(C_k) = \sum_{i=1}^{n} P(A_i)$$
HINT: Let X denote the number of the A_i that occur. Show that both sides of the above are equal to E[X].
• 6. In the text we noted that $$E[\sum_{i=1}^{n} X_i] = \sum_{i=1}^{n} E[X_i]$$
when the X_i are all nonnegative random variables. Since an integral is a limit of sums, one might expect that $$E\left[\int_{0}^{\infty} X(t)\, dt\right] = \int_{0}^{\infty} E[X(t)]\, dt$$
whenever X(t), 0 ≤ t < ∞, are all nonnegative random variables; and this result is indeed true. Use it to give another proof of the result that, for a nonnegative random variable X, $$E[X] = \int_{0}^{\infty} P\{X > t\}\, dt$$
HINT: Define, for each nonnegative t, the random variable X(t) by $$X(t) = \begin{cases}
1 &\text{if } t < X\\
0 &\text{if } t \ge X \end{cases}$$
Now relate $$\int_{0}^{\infty} X(t)\, dt$$ to X.
• 7. We say that X is stochastically larger than Y, written X ≥ₛₜ Y, if for all t, $$P\{X > t\} \geq P\{Y > t\}$$
Show that if X ≥ₛₜ Y, then E[X] ≥ E[Y] when

(a) X and Y are nonnegative random variables;

(b) X and Y are arbitrary random variables.
HINT: Write X as $$X = X^+ - X^-$$
where $$X^+ = \begin{cases}
X &\text{if } X \ge 0\\
0 &\text{if } X < 0 \end{cases}$$
$$X^- = \begin{cases}
0 &\text{if } X \ge 0\\
-X &\text{if } X < 0 \end{cases}$$
Similarly, represent Y as $$Y = Y^+ - Y^-$$ Then make use of part (a).
• 8. Show that X is stochastically larger than Y if and only if $$E[f(X)] \geq E[f(Y)]$$
for all increasing functions f.
HINT: If X ≥ₛₜ Y, show that E[f(X)] ≥ E[f(Y)] by showing that f(X) ≥ₛₜ f(Y) and then using Theoretical Exercise 7. To show that E[f(X)] ≥ E[f(Y)]
for all increasing functions f implies that P{X > t} ≥ P{Y > t}, define an appropriate increasing function f.
• 9. A coin having probability p of landing heads is flipped n times. Compute the expected number of runs of heads of size 1, of size 2, ..., of size k, 1 ≤ k ≤ n.

• 10. Let X1, X2, ..., Xn be independent and identically distributed positive random variables. Find, for k ≤ n, $$
E\left[\frac{\sum_{i=1}^k X_i}{\sum_{i=1}^n X_i}\right]
$$
• 11. Consider n independent trials each resulting in any one of r possible outcomes with probabilities P1, P2, ..., Pr. Let X denote the number of outcomes that never occur in any of the trials. Find E[X] and show that among all probability vectors P1, ..., Pr, E[X] is minimized when Pi = 1/r, i = 1, ..., r.
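For Theoretical Exercise 11, the indicator argument gives E[X] = Σᵢ (1 − pᵢ)ⁿ, which a simulation can confirm (a sketch; the probability vector and n below are my illustrative choices):

```python
import random

def simulate_missing(probs, n, trials=50_000, seed=9):
    """Monte Carlo sketch of Theoretical Exercise 11: average number of
    the r outcomes that never occur in n trials."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        seen = set(rng.choices(range(len(probs)), weights=probs, k=n))
        total += len(probs) - len(seen)
    return total / trials

# Indicators give E[X] = sum_i (1 - p_i)^n
probs, n = [0.5, 0.3, 0.2], 4
exact = sum((1 - p) ** n for p in probs)
```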
• 12. Independent trials are performed. If the ith such trial results in a success with probability Pi, compute

(a) the expected number, and

(b) the variance, of the number of successes that occur in the first n trials. Does independence make a difference in part (a)? In part (b)?
• 13. Let X1, X2, ..., Xn be independent and identically distributed continuous random variables. We say that a record value occurs at time j, j ≤ n, if X_j > X_i for all 1 ≤ i < j. Show that

(a) E[number of record values] = $$
\sum_{j=1}^n 1/j $$;

(b) Var(number of record values) = $$
\sum_{j=1}^n \frac{j-1}{j^2} $$
• 14. For Example 2j show that the variance of the number of coupons needed to amass a full set is equal to $$
\sum_{j=1}^{N-1} \frac{jN}{(N-j)^2}
$$
When N is large, this can be shown to be approximately equal (in the sense that their ratio approaches 1 as N → ∞) to N²π²/6.
• 15. Consider n independent trials, the ith of which results in a success with probability Pi.

(a) Compute the expected number of successes in the n trials—call it μ.

(b) For fixed value of μ, what choice of P1, ..., Pn maximizes the variance of the number of successes?

(c) What choice minimizes the variance?
• 16. Suppose that balls are randomly removed from an urn initially containing n white and m black balls. It was shown in Example 2m that E[X] = 1 + m/(n + 1), where X is the number of draws needed to obtain a white ball.

(a) Compute Var(X).

(b) Show that the expected number of balls that need be drawn to amass a total of k white balls is k[1 + m/(n + 1)].
HINT: Let Y_i, i = 1, ..., n + 1, denote the number of black balls withdrawn after the (i − 1)st white ball and before the ith white ball. Argue that the Y_i, i = 1, ..., n + 1, are identically distributed.
• 17. Suppose that X1 and X2 are independent random variables having a common mean μ. Suppose also that Var(X1) = σ12 and Var(X2) = σ22. The value of μ is unknown and it is proposed to estimate μ by a weighted average of X1
and X2. That is, λX1 + (1 − λ)X2 will be used as an estimate of μ, for some appropriate value of λ. Which value of λ yields the estimate having the lowest possible variance? Explain why it is desirable to use this value of λ.
• 18. In Example 3g we showed that the covariance of the multinomial random variables N1 and N2 is equal to −mP1P2 by expressing N1 and N2 as the sum of indicator variables. This result could also have been obtained by using the formula Var(N1 + N2) = Var(N1) + Var(N2) + 2 Cov(N1, N2)

(a) What is the distribution of N1 + N2?

(b) Use the identity above to show that Cov(N1, N2) = −mP1P2.
• 19. If X and Y are identically distributed, not necessarily independent, show that Cov(X + Y, X − Y) = 0.
• 20. The Conditional Covariance Formula. The conditional covariance of X and Y, given Z, is defined by Cov(X, Y|Z) ≡ E[(X − E[X|Z])(Y − E[Y|Z])|Z]

(a) Show that Cov(X, Y|Z) = E[XY|Z] − E[X|Z]E[Y|Z]

(b) Prove the conditional covariance formula Cov(X, Y) = E[Cov(X, Y|Z)] + Cov(E[X|Z], E[Y|Z]).

(c) Set X = Y in part

(b) and obtain the conditional variance formula.
• 21. Let X(i), i = 1, ..., n, denote the order statistics from a set of n uniform (0, 1) random variables, and note that the density function of X(i) is given by $$
f(x) = \frac{n!}{(i-1)!(n-i)!}x^{i-1}(1-x)^{n-i}
$$, 0 ≤ x ≤ 1

(a) Compute Var(X(i)), i = 1, ..., n.

(b) Which value of i minimizes and which value maximizes Var(X(i))?
• 22. If Y = a + bX, show that $$
\rho(X,Y) = \begin{cases}
+1 & \text{if } b > 0 \\
-1 & \text{if } b < 0 \end{cases}
$$
• 23. If Z is a unit normal random variable and if Y is defined by Y = a + bZ + cZ², show that $$
\rho(Y,Z) = \frac{b}{\sqrt{b^2 + 2c^2}}
$$
• 24. Prove the Cauchy-Schwarz inequality, namely, that $$
(E[XY])^2 ≤ E[X^2]E[Y^2]
$$

HINT: Unless Y = −tX for some constant t, in which case the inequality holds with equality, it follows that for all t, $$0 < E[(tX + Y)^2] = E[X^2]t^2 + 2E[XY]t + E[Y^2]$$
Hence the roots of the quadratic equation $$E[X^2]t^2 + 2E[XY]t + E[Y^2] = 0$$ must be imaginary, which implies that the discriminant of this quadratic equation must be negative.
• 25. Show that if X and Y are independent, then E[X|Y = y] = E[X] for all y

(a) in the discrete case;

(b) in the continuous case.
• 26. Prove that E[g(X)Y | X] = g(X)E[Y | X].
• 27. Prove that if E[Y | X = x] = E[Y] for all x, then X and Y are uncorrelated, and give a counterexample to show that the converse is not true.
HINT: Prove and use the fact that E[XY] = E[X E[Y | X]].
• 28. Show that Cov(X, E[Y | X]) = Cov(X, Y).
• 29. Let X₁, ..., Xₙ be independent and identically distributed random variables. Find $$E[X_1 \mid X_1 + \cdots + X_n = x]$$
• 30. Consider Example 3g, which is concerned with the multinomial distribution.
Use conditional expectation to compute E[N₁N₂] and then use this to verify the formula for Cov(N₁, N₂) given in Example 3g.
• 31. An urn initially contains b black and w white balls. At each stage we add r black balls and then withdraw, at random, r balls from the b + w + r balls in the urn. Show that $$E[\text{number of white balls after stage } t] = \left(\frac{b+w}{b+w+r}\right)^t w$$
• 32. Prove Equation (6.1b).
• 33. A coin that lands on heads with probability p is continually flipped. Compute the expected number of flips that are made until a string of r heads in a row is obtained.
HINT: Condition on the time of the first occurrence of tails to obtain the equation $$
E[X] = (1 - p)\sum_{i=1}^{r} p^{i-1}(i + E[X]) + r p^{r}
$$
Simplify and solve for E[X].
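Solving the hint's equation gives E[X] = (p⁻ʳ − 1)/(1 − p). A quick Monte Carlo sketch (the helper name `mean_flips_for_run` is ours, not from the text) checking this closed form:

```python
import random

def mean_flips_for_run(p, r, trials=100_000, seed=1):
    """Monte Carlo estimate of the expected number of flips until
    r heads in a row appear, for a coin with P(heads) = p."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        run = flips = 0
        while run < r:
            flips += 1
            run = run + 1 if rng.random() < p else 0
        total += flips
    return total / trials

p, r = 0.5, 3
exact = (p**-r - 1) / (1 - p)      # closed form: 14.0 for p = 1/2, r = 3
est = mean_flips_for_run(p, r)     # should be close to 14
```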
• 34. For another approach to Theoretical Exercise 33, let Tᵣ denote the number of flips required to obtain a run of r consecutive heads.

(a) Determine E[Tᵣ|Tᵣ₋₁].

(b) Determine E[Tᵣ] in terms of E[Tᵣ₋₁].

(c) What is E[T₁]?

(d) What is E[Tᵣ]?
• 35.

(a) Prove that E[X] = E[X|X < a]P(X < a) + E[X|X ≥ a]P(X ≥ a)
HINT: Define an appropriate random variable and then compute E[X] by conditioning on it.

(b) Use part (a) to prove Markov's inequality, which states that if P(X ≥ 0) = 1, then for a > 0, $$
P(X \ge a) \le \frac{E[X]}{a}
$$
• 36. One ball at a time is randomly selected from an urn containing n white and m black balls until all of the remaining balls are of the same color. Let Mₙ,ₘ denote the expected number of balls left in the urn when the experiment ends. Compute a recursive formula for Mₙ,ₘ and solve when n = 3, m = 5.
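One can check that the recursion yields the closed form Mₙ,ₘ = n/(m + 1) + m/(n + 1), which equals 1.75 when n = 3, m = 5. A simulation sketch agrees (the balls left are exactly the monochromatic suffix of the random removal order):

```python
import random

def expected_left(n_white, m_black, trials=100_000, seed=2):
    """Average number of balls remaining when the rest of the urn is a
    single color, i.e. the length of the monochromatic suffix."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        urn = ['w'] * n_white + ['b'] * m_black
        rng.shuffle(urn)            # removal order, front to back
        last = urn[-1]
        k = 0
        while k < len(urn) and urn[-1 - k] == last:
            k += 1                  # count the one-color suffix
        total += k
    return total / trials

est = expected_left(3, 5)           # should be near 3/6 + 5/4 = 1.75
```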
• 37. An urn contains a white and b black balls. After a ball is drawn, it is returned to the urn if it is white; but if it is black, it is replaced by a white ball from another urn. Let Mₙ denote the expected number of white balls in the urn after the foregoing operation has been repeated n times.

(a) Derive the recursive equation Mₙ₊₁ = (1 − 1/(a + b))Mₙ + 1

(b) Use part (a) to prove that Mₙ = a + b − b(1 − 1/(a + b))ⁿ

(c) What is the probability that the (n + 1)st ball drawn is white?
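The recursion in part (a) and the closed form in part (b) can be checked against each other numerically; a minimal sketch (helper names are ours):

```python
def M_recursive(a, b, n):
    """Iterate M_{k+1} = (1 - 1/(a+b)) M_k + 1 starting from M_0 = a."""
    M = a
    for _ in range(n):
        M = (1 - 1 / (a + b)) * M + 1
    return M

def M_closed(a, b, n):
    """Closed form from part (b): M_n = a + b - b (1 - 1/(a+b))^n."""
    return a + b - b * (1 - 1 / (a + b)) ** n

# Both start at a (M_0 = a white balls) and approach a + b as n grows.
values = [(M_recursive(3, 5, n), M_closed(3, 5, n)) for n in range(11)]
```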
• 38. The best linear predictor of Y with respect to X₁ and X₂ is equal to a + bX₁ + cX₂, where a, b, and c are chosen to minimize $$
E[(Y - (a + bX_1 + cX_2))^2]
$$
Determine a, b, and c.
• 39. The best quadratic predictor of Y with respect to X is a + bX + cX², where a, b, and c are chosen to minimize E[(Y − (a + bX + cX²))²]. Determine a, b, and c.
• 40. X and Y are jointly normally distributed with joint density function given by $$
f(x, y) = \frac{1}{2\pi\sigma_x\sigma_y\sqrt{1-\rho^2}} \exp\left\{ -\frac{1}{2(1-\rho^2)} \left[ \frac{(x-\mu_x)^2}{\sigma_x^2} + \frac{(y-\mu_y)^2}{\sigma_y^2} - \frac{2\rho(x-\mu_x)(y-\mu_y)}{\sigma_x\sigma_y} \right] \right\}
$$

(a) Show that the conditional distribution of Y, given X = x, is normal with mean µᵧ + ρ(σᵧ/σₓ)(x − µₓ) and variance σᵧ²(1 − ρ²).

(b) Show that Corr(X, Y) = ρ.

(c) Argue that X and Y are independent if and only if ρ = 0.
• 41. Let X be a normal random variable with parameters µ = 0 and σ² = 1 and let I, independent of X, be such that P{I = 1} = P{I = 0} = 1/2. Now define Y by $$
Y = \begin{cases} X & \text{if } I = 1 \\ -X & \text{if } I = 0 \end{cases}
$$
In words, Y is equally likely to equal either X or −X.

(a) Are X and Y independent?

(b) Are I and Y independent?

(c) Show that Y is normal with mean 0 and variance 1.

(d) Show that Cov(X, Y) = 0.
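A simulation sketch illustrating parts (a) and (d): the sample covariance comes out near 0, yet X and Y are clearly dependent since |Y| = |X| always:

```python
import random

rng = random.Random(4)
n = 200_000
xs, ys = [], []
for _ in range(n):
    x = rng.gauss(0, 1)
    heads = rng.random() < 0.5      # I = 1 with probability 1/2
    xs.append(x)
    ys.append(x if heads else -x)   # Y = X if I = 1, else -X

mx = sum(xs) / n
my = sum(ys) / n
cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / n
# cov is near 0, yet |Y| = |X| in every sample, so X, Y are dependent.
```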
• 42. It follows from Proposition 5.1 and the fact that the best linear predictor of Y with respect to X is µᵧ + ρ(σᵧ/σₓ)(X − µₓ) that if
E[Y|X] = a + bX
then
a = µᵧ − ρ(σᵧ/σₓ)µₓ   b = ρσᵧ/σₓ
(Why?) Verify this directly.
• 43. For random variables X and Z, show that
E[(X − Y)²] = E[X²] − E[Y²]
where
Y = E[X|Z]
• 44. Consider a population consisting of individuals able to produce offspring of the same kind. Suppose that each individual will, by the end of its lifetime, have produced j new offspring with probability Pⱼ, j ≥ 0, independently of the number produced by any other individual. The number of individuals initially present, denoted by X₀, is called the size of the zeroth generation.
All offspring of the zeroth generation constitute the first generation, and their number is denoted by X₁. In general, let Xₙ denote the size of the nth generation. Let µ = Σⱼ₌₀^∞ jPⱼ and σ² = Σⱼ₌₀^∞ (j − µ)²Pⱼ denote, respectively, the mean and the variance of the number of offspring produced by a single individual. Suppose that X₀ = 1; that is, initially there is a single individual in the population.

(a) Show that
E[Xₙ] = µE[Xₙ₋₁]

(b) Use part (a) to conclude that
E[Xₙ] = µⁿ

(c) Show that
Var(Xₙ) = σ²µⁿ⁻¹ + µ²Var(Xₙ₋₁)

(d) Use part (c) to conclude that $$
Var(X_n) = \begin{cases} \sigma^2\mu^{n-1}\dfrac{\mu^n - 1}{\mu - 1} & \text{if } \mu \ne 1 \\ n\sigma^2 & \text{if } \mu = 1 \end{cases}
$$
The case described above is known as a branching process, and an important question for a population that evolves along such lines is the probability that the population will eventually die out. Let π denote this probability when the population starts with a single individual. That is,
π = P(population eventually dies out | X₀ = 1)

(e) Argue that π satisfies $$
\pi = \sum_{j=0}^{\infty} P_j \pi^j
$$
HINT: Condition on the number of offspring of the initial member of the population.
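The fixed-point equation in part (e) can be solved numerically by iterating π ← Σⱼ Pⱼπʲ starting from π = 0, which converges monotonically to the smallest nonnegative root. A sketch with a hypothetical offspring distribution (P₀ = 1/4, P₁ = 1/4, P₂ = 1/2, so µ = 5/4 > 1):

```python
def extinction_prob(P, tol=1e-12, max_iter=10_000):
    """Smallest fixed point of pi = sum_j P_j * pi^j, found by
    iterating upward from pi = 0."""
    pi = 0.0
    for _ in range(max_iter):
        new = sum(p * pi**j for j, p in enumerate(P))
        if abs(new - pi) < tol:
            return new
        pi = new
    return pi

# pi solves pi = 1/4 + pi/4 + pi^2/2, whose roots are 1/2 and 1;
# the extinction probability is the smaller root, 1/2.
pi = extinction_prob([0.25, 0.25, 0.5])
```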

• 45. Verify the formula for the moment generating function of a uniform random variable that is given in Table 7.2. Also, differentiate to verify the formulas for the mean and variance.

• 46. For a standard normal random variable Z, let µₙ = E[Zⁿ]. Show that $$
\mu_n = \begin{cases} 0 & \text{when } n \text{ is odd} \\ \dfrac{(2j)!}{2^j j!} & \text{when } n = 2j \end{cases}
$$
HINT: Start by expanding the moment generating function of Z into a Taylor series about 0 to obtain $$
E[e^{tZ}] = e^{t^2/2} = \sum_{j=0}^{\infty} \frac{(t^2/2)^j}{j!}
$$
• 47. Let X be a normal random variable with mean µ and variance σ². Use the results of Theoretical Exercise 46 to show that $$
E[X^n] = \sum_{j=0}^{[n/2]} \binom{n}{2j} \mu^{n-2j} \sigma^{2j} \frac{(2j)!}{2^j j!}
$$
In the equation above, [n/2] is the largest integer less than or equal to n/2. Check your answer by letting n = 1 and n = 2.
• 48. If Y = aX + b, where a and b are constants, express the moment generating function of Y in terms of the moment generating function of X.
• 49. The positive random variable X is said to be a lognormal random variable with parameters µ and σ if log(X) is a normal random variable with mean µ and variance σ². Use the normal moment generating function to find the mean and variance of a lognormal random variable.
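For reference, the normal MGF M(t) = exp(µt + σ²t²/2) gives E[X] = M(1) and E[X²] = M(2) when X = e^Z with Z normal. A simulation sketch comparing these formulas to sample moments (the parameter values below are ours, chosen for illustration):

```python
import math
import random

mu, sigma = 0.1, 0.4
# Moments of X = e^Z via the normal MGF M(t) = exp(mu*t + sigma^2*t^2/2):
mean = math.exp(mu + sigma**2 / 2)                  # E[X] = M(1)
var = math.exp(2 * mu + 2 * sigma**2) - mean**2     # E[X^2] - E[X]^2 = M(2) - M(1)^2

rng = random.Random(6)
n = 200_000
samples = [math.exp(rng.gauss(mu, sigma)) for _ in range(n)]
m_hat = sum(samples) / n
v_hat = sum((s - m_hat) ** 2 for s in samples) / n
# m_hat and v_hat should be close to mean and var.
```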
• 50. Let X have moment generating function M(t), and define Ψ(t) = log M(t). Show that $$
\Psi''(t)\big|_{t=0} = Var(X)
$$
• 51. Use Table 7.2 to determine the distribution of $$
\sum_{i=1}^{n} X_i $$, when X₁, ..., Xₙ are independent and identically distributed exponential random variables, each having mean 1/λ.
• 52. Show how to compute Cov(X, Y) from the joint moment generating function of X and Y.
• 53. Suppose that X₁, ..., Xₙ have a multivariate normal distribution. Show that X₁, ..., Xₙ are independent random variables if and only if $$
Cov(X_i, X_j) = 0 $$ when i ≠ j
• 54. If Z is a unit normal random variable, what is Cov(Z, Z²)?
SELF-TEST PROBLEMS AND EXERCISES
• 1. Consider a list of m names, where the same name may appear more than once on the list. Let n(i) denote the number of times that the name in position i appears on the list, i = 1, ..., m, and let d denote the number of distinct names on the list.

(a) Express d in terms of the variables m, n(i), i = 1, ..., m.
Let U be a uniform (0, 1) random variable, and let X = [mU] + 1.

(b) What is the probability mass function of X?

(c) Argue that E[m/n(X)] = d.
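Part (c) can be verified exactly on a small example, since X is uniform on the m positions: E[m/n(X)] = Σᵢ (1/m)(m/n(i)) = Σᵢ 1/n(i), and each distinct name contributes 1 to that sum. A sketch (the example list is ours):

```python
from collections import Counter

names = ["ann", "bob", "ann", "carol", "bob", "ann"]   # hypothetical list
m = len(names)
counts = Counter(names)
n = [counts[name] for name in names]   # n(i) for each position i
d = len(counts)                        # number of distinct names

# X is uniform on positions 1..m, so E[m/n(X)] is a plain average:
expected = sum(m / n_i for n_i in n) / m   # = sum_i 1/n(i) = d
```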
• 2. An urn has n white and m black balls which are removed one at a time in a randomly chosen order. Find the expected number of instances in which a white ball is immediately followed by a black one.
• 3. Twenty individuals, consisting of 10 married couples, are to be seated at five different tables, with four people at each table.

(a) If the seating is done "at random," what is the expected number of married couples that are seated at the same table?

(b) If two men and two women are randomly chosen to be seated at each table, what is the expected number of married couples that are seated at the same table?
• 4. If a die is to be rolled until all sides have appeared at least once, find the expected number of times that outcome 1 appears.
• 5. A deck of 2n cards consists of n red and n black cards. These cards are shuffled and then turned over one at a time. Suppose that each time a red card is turned over we win 1 unit if more red cards than black cards have been turned over by that time. (For instance, if n = 2 and the result is r b r b, then we would win a total of 2 units.) Find the expected amount that we win.
• 6. Let A₁, A₂, ..., Aₙ be events, and let N denote the number of them that occur. Also, let I equal 1 if all of these events occur, and let it be 0 otherwise.
Prove Bonferroni's inequality, namely that $$
P(A_1 \cdots A_n) \ge \sum_{i=1}^{n} P(A_i) - (n - 1)
$$
HINT: Argue first that N ≤ n − 1 + I.
• 7. Suppose that k of the balls numbered 1, 2, ..., n, where n ≥ k, are randomly chosen. Let X denote the maximum numbered ball chosen. Also, let R denote the number of the n − k unchosen balls that have higher numbers than all the chosen balls.

(a) What is the relationship between X and R?

(b) Express R as the sum of n − k suitably defined Bernoulli random variables.

(c) Use parts (a) and (b) to find E[X].
(Note that E[X] was obtained previously in Theoretical Exercise 28 of Chapter 4.)
• 8. Let X be a Poisson random variable with mean λ. Show that if λ is not too small, then $$
Var(\sqrt{X}) \approx .25 $$
HINT: Use the result of Theoretical Exercise 4 to approximate E[√X].
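A simulation sketch supporting the ≈ .25 claim for a moderately large mean (λ = 100 here; the Poisson sampler uses Knuth's multiplication method, which is fine for moderate λ):

```python
import math
import random

def poisson(rng, lam):
    """Knuth's multiplication method for Poisson(lam) samples."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

lam = 100
rng = random.Random(8)
n = 20_000
roots = [math.sqrt(poisson(rng, lam)) for _ in range(n)]
mean = sum(roots) / n
var = sum((r - mean) ** 2 for r in roots) / n   # should be near 0.25
```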
• 9. Suppose in Self-Test Problem 3 that the 20 people are to be seated at seven tables, three of which have 4 seats and four of which have 2 seats. If the people are randomly seated, find the expected value of the number of married couples that are seated at the same table.
• 10. Individuals 1 through n, n > 1, are to be recruited into a firm in the following manner. Individual 1 starts the firm and recruits individual 2. Individuals 1 and 2 will then compete to recruit individual 3. Once individual 3 is recruited, individuals 1, 2, and 3 will compete to recruit individual 4, and so on. Suppose that when individuals 1, 2, ..., i compete to recruit individual i + 1, each of them is equally likely to be the successful recruiter.

(a) Find the expected number of the individuals 1, ..., n that did not recruit anyone else.

(b) Derive an expression for the variance of the number of individuals who did not recruit anyone else, and evaluate it for n = 5.

• 11. The nine players on a basketball team consist of 2 centers, 3 forwards, and 4 backcourt players. If the players are paired up at random into three groups of size 3 each, find the

(a) expected value and the

(b) variance of the number of triplets consisting of one of each type of player. 

• 12. A deck of 52 cards is shuffled and a bridge hand of 13 cards is dealt out. Let X and Y denote, respectively, the number of aces and the number of spades in the dealt hand.

(a) Show that X and Y are uncorrelated.

(b) Are they independent? 

• 13. Each coin in a bin has a value attached to it. Each time that a coin with value p is flipped, it lands on heads with probability p. When a coin is randomly chosen from the bin, its value is uniformly distributed on (0, 1). Suppose that after the coin is chosen but before it is flipped, you must predict whether it will land heads or tails. You will win 1 if you are correct and will lose 1 otherwise.

(a) What is your expected gain if you are not told the value of the coin?

(b) Suppose now that you are allowed to inspect the coin before it is flipped, with the result of your inspection being that you learn the value of the coin. As a function of p, the value of the coin, what prediction should you make?

(c) Under the conditions of part (b), what is your expected gain?

• 14. In Self-Test Problem 1 we showed how to use the value of a uniform (0, 1) random variable (commonly called a random number) to obtain the value of a random variable whose mean is equal to the expected number of distinct names on a list. However, its use required that one choose a random position and then determine the number of times that the name in that position appears on the list. Another approach, which can be more efficient when there is a large amount of name replication, is as follows. As before, start by choosing the random variable X as in Problem 1. Now identify the name in position X, and then go through the list starting at the beginning until that name appears. Let I equal 0 if you encounter that name before getting to position X, and let I equal 1 if your first encounter with the name is at position X. Show that E[mI] = d.

• 1. Suppose that X is a random variable with mean and variance both equal to 20. What can be said about P{0 ≤ X ≤ 40}?

• 2. From past experience a professor knows that the test score of a student taking her final examination is a random variable with mean 75.

(a) Give an upper bound for the probability that a student's test score will exceed 85. Suppose, in addition, the professor knows that the variance of a student's test score is equal to 25.

(b) What can be said about the probability that a student will score between 65 and 85?

(c) How many students would have to take the examination to ensure, with probability at least .9, that the class average would be within 5 of 75? Do not use the central limit theorem.
• 3. Use the central limit theorem to solve part (c) of Problem 2.
• 4. Let X₁, ..., X₂₀ be independent Poisson random variables with mean 1.

(a) Use the Markov inequality to obtain a bound on $$
P[\sum_{i=1}^{20} X_i > 15]
$$

(b) Use the central limit theorem to approximate $$
P[\sum_{i=1}^{20} X_i > 15]
$$
• 5. Fifty numbers are rounded off to the nearest integer and then summed. If the individual round-off errors are uniformly distributed over (−.5, .5), what is the probability that the resultant sum differs from the exact sum by more than 3?
• 6. A die is continually rolled until the total sum of all rolls exceeds 300. What is the probability that at least 80 rolls are necessary?
• 7. One has 100 light bulbs whose lifetimes are independent exponentials with mean 5 hours. If the bulbs are used one at a time, with a failed bulb being replaced immediately by a new one, what is the probability that there is still a working bulb after 525 hours?
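Problem 7 is a standard central-limit-theorem computation: the total lifetime of the 100 bulbs has mean 100 · 5 = 500 and standard deviation √(100 · 25) = 50, so P{S > 525} ≈ 1 − Φ(0.5). A sketch of the arithmetic (the helper name is ours):

```python
import math

def normal_tail(z):
    """P(Z > z) for a standard normal Z, via the complementary error function."""
    return 0.5 * math.erfc(z / math.sqrt(2))

# 100 independent exponentials with mean 5: total mean 500, total sd 50.
z = (525 - 500) / 50          # 0.5
p = normal_tail(z)            # P(S > 525), about 0.3085
```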
• 8. In Problem 7 suppose that it takes a random time, uniformly distributed over (0, 5), to replace a failed bulb. What is the probability that all bulbs have failed by time 550?
• 9. If X is a gamma random variable with parameters (n, 1), approximately how large need n be so that $$
P\left[\left|\frac{X}{n} - 1\right| > .01\right] < .01?
$$
• 10. Civil engineers believe that W, the amount of weight (in units of 1000 pounds)
that a certain span of a bridge can withstand without structural damage resulting, is normally distributed with mean 400 and standard deviation 40.
Suppose that the weight (again, in units of 1000 pounds) of a car is a random variable with mean 3 and standard deviation 3. How many cars would have to be on the bridge span for the probability of structural damage to exceed .1?
• 11. Many people believe that the daily change of price of a company's stock on the stock market is a random variable with mean 0 and variance σ². That is, if Yₙ represents the price of the stock on the nth day, then Yₙ = Yₙ₋₁ + Xₙ, n ≥ 1, where X₁, X₂, ... are independent and identically distributed random variables with mean 0 and variance σ². Suppose that the stock's price today is 100.
If σ² = 1, what can you say about the probability that the stock's price will exceed 105 after 10 days?
• 12. We have 100 components that we will put in use in a sequential fashion.
That is, component 1 is initially put in use, and upon failure it is replaced by component 2, which is itself replaced upon failure by component 3, and so on. If the lifetime of component i is exponentially distributed with mean 10 + i/10, i = 1, ..., 100, estimate the probability that the total life of all components will exceed 1200. Now repeat when the life distribution of component i is uniformly distributed over (0, 20 + i/5), i = 1, ..., 100.
• 13. Student scores on exams given by a certain instructor have mean 74 and standard deviation 14. This instructor is about to give two exams, one to a class of size 25 and the other to a class of size 64.

(a) Approximate the probability that the average test score in the class of size 25 exceeds 80.

(b) Repeat part (a) for the class of size 64.

(c) Approximate the probability that the average test score in the larger class exceeds that of the other class by over 2.2 points.

(d) Approximate the probability that the average test score in the smaller class exceeds that of the other class by over 2.2 points.
• 14. A certain component is critical to the operation of an electrical system and must be replaced immediately upon failure. If the mean lifetime of this type of component is 100 hours and its standard deviation is 30 hours, how many of these components must be in stock so that the probability that the system is in continual operation for the next 2000 hours is at least .95?
• 15. An insurance company has 10,000 automobile policyholders. The expected yearly claim per policyholder is $240 with a standard deviation of $800.
Approximate the probability that the total yearly claim exceeds $2.7 million.
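The same normal approximation applies here: the total yearly claim has mean 10,000 · 240 = $2.4 million and standard deviation √10,000 · 800 = $80,000, so the desired probability is about 1 − Φ(3.75). A sketch:

```python
import math

n = 10_000
mean_total = n * 240                        # $2,400,000
sd_total = math.sqrt(n) * 800               # $80,000
z = (2_700_000 - mean_total) / sd_total     # 3.75
p = 0.5 * math.erfc(z / math.sqrt(2))       # P(total > $2.7M), roughly 1e-4
```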
• 16. Redo Example 5b under the assumption that the number of man-woman pairs is (approximately) normally distributed. Does this seem like a reasonable supposition?
• 17. Repeat part (a) of Problem 2 when it is known that the variance of a student's test score is equal to 25.
• 18. A lake contains 4 distinct types of fish. Suppose that each fish caught is equally likely to be any one of these types. Let Y denote the number of fish that need be caught to obtain at least one of each type.

(a) Give an interval (a, b) such that P(a ≤ Y ≤ b) ≥ .9.

(b) Using the one-sided Chebyshev inequality, how many fish need we plan on catching so as to be at least 90 percent certain of obtaining at least one of each type?
• 19. If X is a nonnegative random variable with mean 25, what can be said about:

(a) E[X³];

(b) E[√X];

(c) E[log X];

(d) E[e⁻ˣ]?

• 20. Let X be a nonnegative random variable. Prove that $$
E[X] \le (E[X^2])^{1/2} \le (E[X^3])^{1/3} \le \cdots
$$

• 21. Would the results of Example 5f change if the investor were allowed to divide her money and invest the fraction α, 0 < α < 1, in the risky proposition and invest the remainder in the risk-free venture? Her return for such a split investment would be R = αX + (1 − α)m.

• 22. Let X be a Poisson random variable with mean 20.

(a) Use the Markov inequality to obtain an upper bound on p = P{X ≥ 26}.

(b) Use the one-sided Chebyshev inequality to obtain an upper bound on p.

(c) Use the Chernoff bound to obtain an upper bound on p.

(d) Approximate p by making use of the central limit theorem.

(e) Determine p by running an appropriate program.
