Theory of Probability: Questions and Answers
6. Let A₁, A₂, ..., Aₙ be events, and let N denote the number of them that occur. Also, let I equal 1 if all of these events occur, and let it be 0 otherwise. Prove Bonferroni's inequality, namely that $$P(A_1 A_2 \cdots A_n) \ge \sum_{i=1}^{n} P(A_i) - (n-1)$$
5. A deck of 2n cards consists of n red and n black cards. These cards are shuffled and then turned over one at a time. Suppose that each time a red card is turned over we win 1 unit if more red cards than black cards have been turned over by that time. Find the expected amount that we win.
4. If a die is to be rolled until all sides have appeared at least once, find the expected number of times that outcome 1 appears.
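Before deriving the answer analytically, a quick Monte Carlo sketch in Python (function names are illustrative, not from the text) can be used to see what value the expectation should come out to:

```python
import random

def ones_until_all_faces(trials=100_000):
    """Estimate E[number of times face 1 appears before all six faces have shown]."""
    total_ones = 0
    for _ in range(trials):
        seen, ones = set(), 0
        while len(seen) < 6:
            roll = random.randint(1, 6)
            seen.add(roll)
            if roll == 1:
                ones += 1
        total_ones += ones
    return total_ones / trials

print(ones_until_all_faces())
```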
3. Twenty individuals, consisting of 10 married couples, are to be seated at five different tables, with four people at each table. (a) If the seating is done "at random," what is the expected number of married couples that are seated at the same table? (b) If 2 men and 2 women are randomly chosen to be seated at each table, what is the expected number of married couples that are seated at the same table?
2. An urn has n white and m black balls, which are removed one at a time in a randomly chosen order. Find the expected number of instances in which a white ball is immediately followed by a black one.
1. Consider a list of m names, where the same name may appear more than once on the list. Let n(i) denote the number of times that the name in position i appears on the list, i = 1, ..., m, and let d denote the number of distinct names on the list.
54. If Z is a unit normal random variable, what is Cov(Z, Z²)?
53. Suppose that X₁, ..., Xₙ have a multivariate normal distribution. Show that X₁, ..., Xₙ are independent random variables if and only if $$Cov(X_i, X_j) = 0$$ when i ≠ j.
52. Show how to compute Cov(X, Y) from the joint moment generating function of X and Y.
51. Use Table 7.2 to determine the distribution of $$\sum_{i=1}^{n} X_i$$ when X₁, ..., Xₙ are independent and identically distributed exponential random variables, each having mean 1/λ.
50. Let X have moment generating function M(t), and define ψ(t) = log M(t). Show that $$\psi''(t)\big|_{t=0} = \mathrm{Var}(X)$$
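A quick way to see this (a standard verification, not part of the exercise text): write ψ' and ψ'' in terms of M and use M(0) = 1, M'(0) = E[X], and M''(0) = E[X²]:
$$\psi'(t) = \frac{M'(t)}{M(t)}, \qquad \psi''(t) = \frac{M''(t)\,M(t) - \big(M'(t)\big)^2}{M(t)^2}$$
so that
$$\psi''(0) = M''(0) - \big(M'(0)\big)^2 = E[X^2] - (E[X])^2 = \mathrm{Var}(X).$$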
48. If Y = aX + b, where a and b are constants, express the moment generating function of Y in terms of the moment generating function of X.
49. The positive random variable X is said to be a lognormal random variable with parameters μ and σ² if log X is a normal random variable with mean μ and variance σ². Use the normal moment generating function to find the mean and variance of a lognormal random variable.
47. Let X be a normal random variable with mean µ and variance σ². Use the results of Theoretical Exercise 46 to show that $$E[X^n] = \sum_{j=0}^{\lfloor n/2 \rfloor} \binom{n}{2j}\, \mu^{n-2j}\, \sigma^{2j}\, \frac{(2j)!}{2^j\, j!}$$
46. For a standard normal random variable Z, let µₙ = E[Zⁿ]. Show that $$\mu_n = \begin{cases} 0 & \text{when } n \text{ is odd} \\ \dfrac{(2j)!}{2^j\, j!} & \text{when } n = 2j \end{cases}$$ HINT: Start by expanding the moment generating function of Z into a Taylor series about 0.
45. Verify the formula for the moment generating function of a uniform random variable that is given in Table 7.2. Also, differentiate to verify the formulas for the mean and variance.
44. Consider a population consisting of individuals able to produce offspring of the same kind. Suppose that each individual will, by the end of its lifetime, have produced j new offspring with probability Pⱼ, j ≥ 0, independently of the number produced by any other individual.
43. For random variables X and Z, show that $$E[(X - Y)^2] = E[X^2] - E[Y^2]$$ where Y = E[X|Z].
42. It follows from Proposition 5.1 and the fact that the best linear predictor of Y with respect to X is $$\mu_y + \rho\,\frac{\sigma_y}{\sigma_x}(X - \mu_x)$$ that if E[Y|X] = a + bX, then $$a = \mu_y - \rho\,\frac{\sigma_y}{\sigma_x}\,\mu_x, \qquad b = \rho\,\frac{\sigma_y}{\sigma_x}$$ (Why?)
41. Let X be a normal random variable with parameters µ = 0 and σ² = 1, and let I, independent of X, be such that P{I = 1} = P{I = 0} = 1/2. Now define Y by $$Y = \begin{cases} X & \text{if } I = 1 \\ -X & \text{if } I = 0 \end{cases}$$ In words, Y is equally likely to equal either X or −X.
40. X and Y are jointly normally distributed with joint density function given by $$f(x, y) = \frac{1}{2\pi\sigma_x\sigma_y\sqrt{1-\rho^2}}\, \exp\left\{-\frac{1}{2(1-\rho^2)}\left[\left(\frac{x-\mu_x}{\sigma_x}\right)^2 + \left(\frac{y-\mu_y}{\sigma_y}\right)^2 - \frac{2\rho(x-\mu_x)(y-\mu_y)}{\sigma_x\sigma_y}\right]\right\}$$
39. The best quadratic predictor of Y with respect to X is a + bX + cX², where a, b, and c are chosen to minimize E[(Y − (a + bX + cX²))²]. Determine a, b, and c.
38. The best linear predictor of Y with respect to X₁ and X₂ is equal to a + bX₁ + cX₂, where a, b, and c are chosen to minimize $$E\left[\big(Y - (a + bX_1 + cX_2)\big)^2\right]$$ Determine a, b, and c.
37. An urn contains a white and b black balls. After a ball is drawn, it is returned to the urn if it is white; but if it is black, it is replaced by a white ball from another urn. Let Mₙ denote the expected number of white balls in the urn after the foregoing operation has been repeated n times.
36. One ball at a time is randomly selected from an urn containing a white and b black balls until all of the remaining balls are of the same color. Let M_{a,b} denote the expected number of balls left in the urn when the experiment ends.
35. (a) Prove that $$E[X] = E[X \mid X < Y]\,P(X < Y) + E[X \mid X \ge Y]\,P(X \ge Y)$$ HINT: Define an appropriate random variable and then compute E[X] by conditioning on it.
34. For another approach to Theoretical Exercise 33, let Tᵣ denote the number of flips required to obtain a run of r consecutive heads. (a) Determine E[Tᵣ | Tᵣ₋₁]. (b) Determine E[Tᵣ] in terms of E[Tᵣ₋₁].
33. A coin, which lands on heads with probability p, is continually flipped. Compute the expected number of flips that are made until a string of r heads in a row is obtained. HINT: Condition on the time of the first occurrence of tails.
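For this exercise, conditioning gives the standard closed form $$E[\text{flips}] = \sum_{i=1}^{r} (1/p)^i$$ and a short simulation sketch (illustrative code, not from the text) can be used to check it:

```python
import random

def flips_until_run(p, r, trials=100_000):
    """Average number of flips until r consecutive heads appear (Monte Carlo)."""
    total = 0
    for _ in range(trials):
        run = flips = 0
        while run < r:
            flips += 1
            run = run + 1 if random.random() < p else 0
        total += flips
    return total / trials

p, r = 0.5, 3
closed_form = sum((1 / p) ** i for i in range(1, r + 1))
print(flips_until_run(p, r), closed_form)  # both near 14 for p = 0.5, r = 3
```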
32. Prove Equation (6.1b).
31. An urn initially contains b black and w white balls. At each stage we add r black balls and then withdraw, at random, r balls from the b + w + r balls in the urn. Show that $$E[\text{number of white balls after stage } t] = \left(\frac{b+w}{b+w+r}\right)^{t} w$$
30. Consider Example 3g, which is concerned with the multinomial distribution. Use conditional expectation to compute E[N₁N₂] and then use this result to verify the formula for Cov(N₁, N₂).
29. Let X₁, …, Xₙ be independent and identically distributed random variables. Find $$E[X_1 \mid X_1 + \cdots + X_n = x]$$
28. Show that $$Cov(X,\, E[Y|X]) = Cov(X, Y)$$
27. Prove that if E[Y | X = x] = E[Y] for all x, then X and Y are uncorrelated, and give a counterexample to show that the converse is not true. HINT: Prove and use the fact that $$E[XY] = E\big[X\,E[Y|X]\big]$$
26. Prove that $$E[g(X)\,Y \mid X] = g(X)\,E[Y \mid X]$$
25. Show that if X and Y are independent, then $$E[Y \mid X = x] = E[Y] \quad \text{for all } x$$ (a) in the discrete case; (b) in the continuous case.
24. Prove the Cauchy–Schwarz inequality, namely, that $$(E[XY])^2 \le E[X^2]\,E[Y^2]$$ HINT: Unless Y = −tX for some constant t, in which case the inequality holds with equality, it follows that for all t, $$0 < E[(tX + Y)^2] = E[X^2]t^2 + 2E[XY]t + E[Y^2]$$
23. If Z is a unit normal random variable and if Y is defined by Y = a + bZ + cZ², show that $$\rho(Y, Z) = \frac{b}{\sqrt{b^2 + 2c^2}}$$
22. If Y = a + bX, show that $$\rho(X, Y) = \begin{cases} +1 & \text{if } b > 0 \\ -1 & \text{if } b < 0 \end{cases}$$
21. Let X(i), i = 1, ..., n, denote the order statistics from a set of n uniform (0, 1) random variables, and note that the density function of X(i) is given by $$f(x) = \frac{n!}{(i-1)!\,(n-i)!}\, x^{i-1}(1-x)^{n-i}, \qquad 0 < x < 1$$
20. The Conditional Covariance Formula. The conditional covariance of X and Y, given Z, is defined by $$Cov(X, Y \mid Z) = E\big[(X - E[X|Z])(Y - E[Y|Z]) \mid Z\big]$$ (a) Show that $$Cov(X, Y \mid Z) = E[XY \mid Z] - E[X|Z]\,E[Y|Z]$$ (b) Prove the conditional covariance formula $$Cov(X, Y) = E\big[Cov(X, Y \mid Z)\big] + Cov\big(E[X|Z],\, E[Y|Z]\big)$$ (c) Set X = Y in part (b) and obtain the conditional variance formula.
19. If X and Y are identically distributed, but not necessarily independent, show that $$Cov(X + Y,\, X - Y) = 0$$
18. In Example 3g we showed that the covariance of the multinomial random variables N₁ and N₂ is equal to −mP₁P₂ by expressing N₁ and N₂ as sums of indicator variables. This result could also have been obtained by using the fact that N₁ + N₂ is a binomial random variable with parameters m and P₁ + P₂.
17. Suppose that X₁ and X₂ are independent random variables having a common mean μ. Suppose also that Var(X₁) = σ₁² and Var(X₂) = σ₂². The value of μ is unknown, and it is proposed to estimate μ by a weighted average of X₁ and X₂; that is, λX₁ + (1 − λ)X₂ will be used as an estimate of μ for some appropriate value of λ. Which value of λ yields the estimate having the lowest possible variance?
16. Suppose that balls are randomly removed from an urn initially containing n white and m black balls. It was shown in Example 2m that E[X] = 1 + m/(n + 1), where X is the number of draws needed to obtain the first white ball.
15. Consider n independent trials, the ith of which results in a success with probability Pᵢ. (a) Compute the expected number of successes in the n trials; call it μ. (b) For a fixed value of μ, what choice of P₁, ..., Pₙ maximizes the variance of the number of successes?
14. For Example 2j, show that the variance of the number of coupons needed to amass a full set is equal to $$\sum_{j=1}^{N-1} \frac{jN}{(N-j)^2}$$ When N is large, this can be shown to be approximately equal (in the sense that the ratio of the two quantities tends to 1 as N → ∞) to N²π²/6.
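A simulation sketch (illustrative, not part of the exercise) that compares the sample variance of the number of draws against the stated sum:

```python
import random

def coupon_variance(N, trials=20_000):
    """Sample variance of the number of coupons needed to amass a full set."""
    samples = []
    for _ in range(trials):
        seen, draws = set(), 0
        while len(seen) < N:
            seen.add(random.randrange(N))
            draws += 1
        samples.append(draws)
    mean = sum(samples) / trials
    return sum((s - mean) ** 2 for s in samples) / (trials - 1)

N = 10
formula = sum(j * N / (N - j) ** 2 for j in range(1, N))  # the stated sum
print(coupon_variance(N), formula)
```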
13. Let X₁, X₂, ..., Xₙ be independent and identically distributed continuous random variables. We say that a record value occurs at time j, j ≤ n, if Xⱼ ≥ Xᵢ for all 1 ≤ i ≤ j. Show that (a) E[number of record values] = $$\sum_{j=1}^{n} 1/j$$; (b) Var(number of record values) = $$\sum_{j=1}^{n} (j-1)/j^2$$.
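The key observation is that the indicator of a record at time j has mean 1/j and the indicators are independent, which is what the two sums express. A quick check by simulation (illustrative code):

```python
import random

def record_mean_var(n, trials=50_000):
    """Sample mean and variance of the number of record values among n draws."""
    counts = []
    for _ in range(trials):
        best, records = float("-inf"), 0
        for _ in range(n):
            x = random.random()
            if x > best:
                best, records = x, records + 1
        counts.append(records)
    m = sum(counts) / trials
    v = sum((c - m) ** 2 for c in counts) / (trials - 1)
    return m, v

n = 10
print(record_mean_var(n))
print(sum(1 / j for j in range(1, n + 1)),           # claimed mean
      sum((j - 1) / j**2 for j in range(1, n + 1)))  # claimed variance
```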
12. Independent trials are performed. If the ith such trial results in a success with probability Pᵢ, compute (a) the expected number, and (b) the variance, of the number of successes that occur in the first n trials.
11. Consider n independent trials, each resulting in any one of r possible outcomes with probabilities P₁, P₂, ..., Pᵣ. Let X denote the number of outcomes that never occur in any of the trials. Find E[X] and show that, among all probability vectors P₁, ..., Pᵣ, E[X] is minimized when Pᵢ = 1/r, i = 1, ..., r.
10. Let X₁, X₂, ..., Xₙ be independent and identically distributed positive random variables. Find, for k ≤ n, $$E\left[\frac{\sum_{i=1}^{k} X_i}{\sum_{i=1}^{n} X_i}\right]$$
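By symmetry each term Xᵢ/(X₁ + ⋯ + Xₙ) has the same expectation, and the n terms sum to 1, so the answer is k/n. A simulation sketch (exponential draws chosen only for illustration):

```python
import random

def ratio_mean(k, n, trials=200_000):
    """Estimate E[(X_1 + ... + X_k) / (X_1 + ... + X_n)] for i.i.d. positive draws."""
    total = 0.0
    for _ in range(trials):
        xs = [random.expovariate(1.0) for _ in range(n)]
        total += sum(xs[:k]) / sum(xs)
    return total / trials

print(ratio_mean(3, 7), 3 / 7)
```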
9. A coin having probability p of landing heads is flipped n times. Compute the expected number of runs of heads of size 1, of size 2, and of size k, 1 ≤ k ≤ n.
8. Show that X is stochastically larger than Y if and only if $$E[f(X)] \ge E[f(Y)]$$ for all increasing functions f. HINT: If X ≥st Y, show that E[f(X)] ≥ E[f(Y)] by showing that f(X) ≥st f(Y) and then using the result of Theoretical Exercise 7.
7. We say that X is stochastically larger than Y, written X ≥st Y, if for all t, $$P(X > t) \ge P(Y > t)$$ Show that if X ≥st Y, then E[X] ≥ E[Y] when (a) X and Y are nonnegative random variables; (b) X and Y are arbitrary random variables.
6. In the text we noted that $$E\left[\sum_{i=1}^{\infty} X_i\right] = \sum_{i=1}^{\infty} E[X_i]$$ when the Xᵢ are all nonnegative random variables. Since an integral is a limit of sums, one might expect that $$E\left[\int_0^{\infty} X(t)\,dt\right] = \int_0^{\infty} E[X(t)]\,dt$$ whenever X(t), 0 ≤ t < ∞, are all nonnegative random variables; this result is indeed true.
5. Let A₁, A₂, ..., Aₙ be arbitrary events, and define Cₖ = {at least k of the Aᵢ occur}. Show that $$\sum_{k=1}^{n} P(C_k) = \sum_{i=1}^{n} P(A_i)$$ HINT: Let X denote the number of the Aᵢ that occur. Show that both sides of the preceding equation are equal to E[X].
4. Let X be a random variable having finite expectation µ and variance σ², and let g(·) be a twice differentiable function. Show that $$E[g(X)] \approx g(\mu) + \frac{g''(\mu)}{2}\,\sigma^2$$ HINT: Expand g(·) in a Taylor series about µ. Use the first three terms and ignore the remainder.
3. Prove Proposition 2.1 when (a) X and Y have a joint probability mass function; (b) X and Y have a joint probability density function and g(x, y) ≥ 0 for all x, y.
2. Suppose that X is a continuous random variable with density function f. Show that E[|X − a|] is minimized when a is equal to the median of F. HINT: Write $$E[|X - a|] = \int |x - a|\, f(x)\, dx$$ Now break up the integral into the regions where x < a and where x > a, and differentiate.
1. Show that E[(X − a)²] is minimized at a = E[X].
74. Two envelopes, each containing a check, are placed in front of you. You are to choose one of the envelopes, open it, and see the amount of the check. At this point you can either accept that amount or exchange it for the check in the unopened envelope.
73. The joint density of X and Y is given by $$f(x, y) = \frac{1}{\sqrt{2\pi}}\, e^{-y}\, e^{-(x-y)^2/2}, \qquad 0 < y < \infty,\ -\infty < x < \infty$$ (a) Compute the joint moment generating function of X and Y. (b) Compute the individual moment generating functions.
72. Let X be the value of the first die and Y the sum of the values when two dice are rolled. Compute the joint moment generating function of X and Y.
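Writing Y = X + D₂, where D₂ is the second die, the joint MGF factors as E[e^{(t₁+t₂)X}] E[e^{t₂D₂}]. A short enumeration over the 36 equally likely outcomes confirms the factorization (illustrative code, not from the text):

```python
import math

def die_mgf(t):
    """MGF of a single fair die: (1/6) * sum_{i=1}^{6} e^{t i}."""
    return sum(math.exp(t * i) for i in range(1, 7)) / 6

def joint_mgf(t1, t2):
    """E[exp(t1*X + t2*Y)] by enumerating the 36 outcomes of two dice."""
    return sum(math.exp(t1 * d1 + t2 * (d1 + d2))
               for d1 in range(1, 7) for d2 in range(1, 7)) / 36

t1, t2 = 0.3, -0.2
print(joint_mgf(t1, t2), die_mgf(t1 + t2) * die_mgf(t2))  # should agree
```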
71. The moment generating function of X is given by $$M_X(t) = \exp\{2e^t - 2\}$$ and that of Y by $$M_Y(t) = \left(\frac{3}{4}\,e^t + \frac{1}{4}\right)^{10}$$ If X and Y are independent, what are (a) P(X + Y = 2); (b) P(XY = 0); (c) E[XY]?
70. In Example 5c, suppose that X is uniformly distributed over (0, 1). If the discretized regions are determined by a₀ = 0, a₁ = 1/2, and a₂ = 1, determine the optimal quantizer Y and compute E[(X − Y)²].
69. In Example 5b, let S denote the signal sent and R the signal received. (a) Compute E[R]. (b) Compute Var(R). (c) Is R normally distributed? (d) Compute Cov(R, S).
68. Suppose that in Problem 66 we continue to flip the coin until a head appears. Let N denote the number of flips needed. Find (a) P(N ≥ i), i ≥ 0; (b) P(N = i); (c) E[N].
67. In Problem 66, suppose that the coin is tossed n times. Let X denote the number of heads that occur. Show that $$P(X = i) = \int_0^1 \binom{n}{i}\, p^i (1-p)^{n-i}\, dp = \frac{1}{n+1}, \qquad i = 0, 1, \ldots, n$$ HINT: Make use of the fact that $$\int_0^1 x^{a-1}(1-x)^{b-1}\,dx = \frac{(a-1)!\,(b-1)!}{(a+b-1)!}$$ when a and b are positive integers.
66. Consider an urn containing a large number of coins, and suppose that each of the coins has some probability p of turning up heads when it is flipped. However, this value of p varies from coin to coin in accordance with a uniform distribution over (0, 1).
65. Repeat Problem 64 when the proportion of the population having a value of λ less than x is equal to 1 − e⁻ˣ.
64. The number of accidents that a person has in a given year is a Poisson random variable with mean λ. However, suppose that the value of λ changes from person to person, being equal to 2 for 60 percent of the population and 3 for the other 40 percent. If a person is chosen at random, what is the probability that he will have (a) no accidents and (b) exactly 3 accidents in a certain year?
63. Consider a gambler who at each gamble either wins or loses her bet with probabilities p and 1 − p. When p > 1/2, a popular gambling system, known as the Kelly strategy, is to always bet the fraction 2p − 1 of your current fortune.
62. The dice game of craps was defined in Problem 26 of Chapter 2. Compute (a) the mean and (b) the variance of the number of rolls of the dice that it takes to complete one game of craps.
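A simulation sketch for the mean (the variance follows the same pattern), using the standard craps rules: the game ends on the come-out roll with 7 or 11 (win) or 2, 3, or 12 (loss); otherwise the sum becomes the point and rolling continues until the point or a 7 appears. The estimate should land near 3.38 rolls:

```python
import random

def craps_rolls():
    """Number of rolls needed to complete one game of craps."""
    roll = lambda: random.randint(1, 6) + random.randint(1, 6)
    first = roll()
    if first in (2, 3, 7, 11, 12):
        return 1                  # game decided on the come-out roll
    n = 1
    while True:                   # roll until the point or a 7 appears
        n += 1
        if roll() in (first, 7):
            return n

trials = 200_000
print(sum(craps_rolls() for _ in range(trials)) / trials)
```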
60. Type i light bulbs function for a random amount of time having mean µᵢ and standard deviation σᵢ, i = 1, 2. A light bulb randomly chosen from a bin of bulbs is a type 1 bulb with probability p and a type 2 bulb with probability 1 − p. Let X denote the lifetime of this bulb. Find (a) E[X]; (b) Var(X).
59. An urn contains 30 balls, of which 10 are red and 8 are blue. From this urn, 12 balls are randomly withdrawn. Let X denote the number of red, and Y the number of blue, balls that are withdrawn. Find Cov(X, Y).
58. Let U₁, U₂, ... be a sequence of independent uniform (0, 1) random variables. In Example 4h we showed that for 0 ≤ x ≤ 1, E[N(x)] = eˣ, where $$N(x) = \min\left\{n : \sum_{i=1}^{n} U_i > x\right\}$$
57. Each of m + 2 players pays 1 unit to a kitty in order to play the following game: A fair coin is to be flipped successively n times, where n is an odd number, and the successive outcomes are noted.
56. There are n + 1 participants in a game. Each person, independently, is a winner with probability p. The winners share a total prize of 1 unit. (For instance, if 4 people win, then each of them receives 1/4, whereas if there are no winners, then none of the participants receives anything.)
55. A person continually flips a coin until a run of 3 consecutive heads appears. Assuming that each flip independently lands on heads with probability p, determine the expected number of flips required.
54. A coin having probability p of coming up heads is continually flipped until both heads and tails have appeared. Find (a) the expected number of flips; (b) the probability that the last flip lands on heads.
53. Suppose that the expected number of accidents per week at an industrial plant is 5. Suppose also that the numbers of workers injured in each accident are independent random variables with a common mean of 2.5. If the number of workers injured in each accident is independent of the number of accidents that occur, compute the expected number of workers injured in a week.
52. The number of people that enter an elevator on the ground floor is a Poisson random variable with mean 10. If there are N floors above the ground floor, and if each person is equally likely to get off at any one of these N floors, independently of where the others get off, compute the expected number of stops that the elevator will make before discharging all of its passengers.
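Since each entering person gets off independently and uniformly, the number getting off at any given floor is Poisson with mean 10/N (Poisson thinning), so the expected number of stops is N(1 − e^{−10/N}). A simulation sketch to confirm (illustrative code):

```python
import math
import random

def expected_stops(N, trials=100_000):
    """Estimate the expected number of floors at which the elevator stops."""
    total = 0
    for _ in range(trials):
        # Draw Poisson(10) by counting unit-rate exponential arrivals in [0, 10).
        people, t = 0, random.expovariate(1.0)
        while t < 10:
            people += 1
            t += random.expovariate(1.0)
        total += len({random.randrange(N) for _ in range(people)})
    return total / trials

N = 12
print(expected_stops(N), N * (1 - math.exp(-10 / N)))
```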
51. Ten hunters are waiting for ducks to fly by. When a flock of ducks flies overhead, the hunters fire at the same time, but each chooses his target at random, independently of the others. If each hunter independently hits his target with probability .6, compute the expected number of ducks that are hit.
50. Consider the following dice game. A pair of dice is rolled. If the sum is 7, then the game ends and you win 0. If the sum is not 7, then you have the option of either stopping the game and receiving an amount equal to that sum or starting over again.
49. A prisoner is trapped in a cell containing 3 doors. The first door leads to a tunnel that returns him to his cell after 2 days' travel. The second leads to a tunnel that returns him to his cell after 4 days' travel, and the third door leads to freedom.
48. A population is made up of r disjoint subgroups. Let pᵢ denote the proportion of the population that is in subgroup i, i = 1, ..., r. If the average weight of the members of subgroup i is wᵢ, i = 1, ..., r, what is the average weight of the members of the population?
47. The joint density of X and Y is given by f(x, y) = …
46. The joint density of X and Y is given by $$f(x, y) = \frac{e^{-x/y}\, e^{-y}}{y}, \qquad 0 < x < \infty,\ 0 < y < \infty$$ Compute E[X² | Y = y].
45. An urn contains 4 white and 6 black balls. Two successive random samples of sizes 3 and 5, respectively, are drawn from the urn without replacement. Let X and Y denote the number of white balls in the respective samples, and compute E[X | Y = i].
44. A fair die is successively rolled. Let X and Y denote, respectively, the number of rolls necessary to obtain a 6 and a 5. Find (a) E[X]; (b) E[X | Y = 1]; (c) E[X | Y = 5].
43. Consider a graph having n vertices labeled 1, 2, ..., n, and suppose that between each of the $$\binom{n}{2}$$ pairs of distinct vertices an edge is, independently, present with probability p. The degree of vertex i, designated as Dᵢ, is the number of edges that have vertex i as one of their endpoints. (a) What is the distribution of Dᵢ? (b) Find ρ(Dᵢ, Dⱼ), the correlation between Dᵢ and Dⱼ.
42. Consider the following dice game, as played at a certain gambling casino: Players 1 and 2 roll a pair of dice in turn. The bank then rolls the dice to determine the outcome according to the following rule: Player i, i = 1, 2, wins if his roll is strictly greater than the bank's.
41. If X₁, X₂, X₃, X₄ are (pairwise) uncorrelated random variables, each having mean 0 and variance 1, compute the correlations of (a) X₁ + X₂ and X₂ + X₃; (b) X₁ + X₂ and X₃ + X₄.
40. There are two distinct methods for manufacturing certain goods, the quality of goods produced by method i being a continuous random variable having distribution Fᵢ, i = 1, 2. Suppose that n goods are produced by method 1 and m goods by method 2. Rank the n + m goods according to quality.
39. Let X₁, X₂, ..., Xₙ be independent random variables having an unknown continuous distribution function F, and let Y₁, Y₂, ..., Yₘ be independent random variables having an unknown continuous distribution function G.
38. A group of 20 people, consisting of 10 men and 10 women, is randomly arranged into 10 pairs of 2 each. Compute the expectation and variance of the number of pairs that consist of a man and a woman.
37. A pond contains 100 fish, of which 30 are carp. If 20 fish are caught, what are the mean and variance of the number of carp among these 20? What assumptions are you making?
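Assuming every set of 20 fish is equally likely to be caught (the assumption the problem asks about), the carp count is hypergeometric, with the standard mean and variance below; the sketch checks them by sampling without replacement:

```python
import random

N, K, n = 100, 30, 20                 # fish in pond, carp, fish caught
mean = n * K / N                      # hypergeometric mean
var = n * (K / N) * (1 - K / N) * (N - n) / (N - 1)  # hypergeometric variance

pond = [1] * K + [0] * (N - K)        # 1 marks a carp
trials = 100_000
draws = [sum(random.sample(pond, n)) for _ in range(trials)]
m = sum(draws) / trials
v = sum((d - m) ** 2 for d in draws) / (trials - 1)
print((mean, var), (m, v))
```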
36. The joint density function of X and Y is given by $$f(x, y) = \frac{1}{y}\, e^{-(x/y + y)}, \qquad x > 0,\ y > 0$$ Find E[X], E[Y], and show that Cov(X, Y) = 1.
35. Let X₁, X₂, ... be independent with common mean µ and common variance σ², and set Yₙ = Xₙ + Xₙ₊₁ + Xₙ₊₂. For j ≥ 0, find Cov(Yₙ, Yₙ₊ⱼ).
34. The random variables X and Y have a joint density function given by $$f(x, y) = \begin{cases} 2e^{-2x}/x & 0 \le x < \infty,\ 0 \le y \le x \\ 0 & \text{otherwise} \end{cases}$$ Compute Cov(X, Y).
33. A die is rolled twice. Let X equal the sum of the outcomes, and let Y equal the first outcome minus the second. Compute Cov(X, Y).
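Writing X = D₁ + D₂ and Y = D₁ − D₂ for the two rolls, bilinearity gives Cov(X, Y) = Var(D₁) − Var(D₂) = 0. Exact enumeration over the 36 outcomes confirms this:

```python
# Exact computation over the 36 equally likely outcomes of two die rolls.
outcomes = [(d1 + d2, d1 - d2) for d1 in range(1, 7) for d2 in range(1, 7)]
ex = sum(x for x, _ in outcomes) / 36
ey = sum(y for _, y in outcomes) / 36
cov = sum((x - ex) * (y - ey) for x, y in outcomes) / 36
print(cov)  # 0.0
```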