Questions and Answers: Statistics for Business and Economics
5-51. Demonstrate that if (X,Y) is a bivariate random variable with σX and σY finite, then −1 ≤ ρXY ≤ 1 . (Hint: (a) For X, Y continuous, apply Schwarz’s Inequality for two variables; and
5-50. Verify that for (X,Y) a bivariate random variable, COV(X,Y) = E(XY)−E(X)E(Y).
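The identity in 5-50 is easy to check numerically. The sketch below uses a small, hypothetical discrete joint pmf (not from the text) and compares the definitional covariance E[(X − EX)(Y − EY)] against the shortcut E(XY) − E(X)E(Y):

```python
# Numeric check of COV(X, Y) = E(XY) - E(X)E(Y) on a small,
# hypothetical discrete joint pmf (illustrative values only).
pmf = {  # (x, y) -> probability
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}
ex  = sum(x * p for (x, y), p in pmf.items())
ey  = sum(y * p for (x, y), p in pmf.items())
exy = sum(x * y * p for (x, y), p in pmf.items())

# Definition: COV(X, Y) = E[(X - E X)(Y - E Y)]
cov_def   = sum((x - ex) * (y - ey) * p for (x, y), p in pmf.items())
cov_short = exy - ex * ey  # shortcut formula from 5-50
assert abs(cov_def - cov_short) < 1e-12
```

For this pmf both routes give COV(X, Y) = 0.4 − (0.7)(0.6) = −0.02.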
5-49. Verify that for (X,Y) a bivariate random variable, V(Y) = E[V(Y|X)] + V[E(Y|X)]. (Hint: Begin with the expression E[V(Y|X)] = E{E(Y²|X) − [E(Y|X)]²}.)
5-48. For (X,Y) a bivariate random variable, suppose X and Y are independent random variables whose individual expectations exist. Then COV(X,Y) = 0. Verify this result.
5-47. For the bivariate random variable (X,Y) having the probability density function f(x, y) = 1/x for 0 < x < 1 and 0 < y < x; 0 elsewhere, find the marginal probability density functions g(x) and
5-46. For (X,Y) a bivariate random variable, suppose that V(X) and V(Y) both exist. Then COV(X,Y) exists. Comment on this assertion.
5-45. For (X,Y) a bivariate random variable, suppose X and Y are independent random variables whose individual expectations exist. Then E(XY) exists and E(XY) = E(X) · E(Y). Verify this result for
5-44. For (X,Y) a bivariate random variable, suppose that V(X) and V(Y) exist. Then V(aX + bY) exists and V(aX + bY) = a²V(X) + b²V(Y) + 2ab·COV(X,Y). Verify this result for X,Y continuous random
5-43. Verify that for the bivariate random variable (X,Y), E[α(X) + β(Y)] = E[α(X)] + E[β(Y)]. Consider both the discrete and continuous cases.
5-42. Verify that for X,Y independent random variables, V(X + Y) = V(X) + V(Y). (Hint: Use V(X + Y) = E[(X + Y)²] − [E(X + Y)]², with E(X + Y) = E(X) + E(Y) and E(XY) = E(X) · E(Y).)
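A Monte Carlo sanity check of 5-42 (a sketch, not a proof; the choice of U(0,1) and a rate-2 exponential for X and Y is arbitrary):

```python
import random
import statistics

# For independent X and Y, V(X + Y) should match V(X) + V(Y).
# Distributions below are arbitrary illustrative choices.
rng = random.Random(42)
n = 200_000
xs = [rng.uniform(0, 1) for _ in range(n)]      # X ~ U(0,1), V(X) = 1/12
ys = [rng.expovariate(2.0) for _ in range(n)]   # Y ~ Exp(rate 2), V(Y) = 1/4
zs = [x + y for x, y in zip(xs, ys)]

vx, vy, vz = (statistics.pvariance(v) for v in (xs, ys, zs))
assert abs(vz - (vx + vy)) < 0.01  # difference is sampling error only
```

The leftover gap is exactly twice the sample covariance of the two independent streams, which shrinks toward zero as n grows.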
5-41. Suppose the bivariate random variable (X,Y) has a probability density function of the form f(x, y) = 2 for 0 < x < 1 and x < y < 1; 0 elsewhere. Find: (a) E(X) (b) E(Y) (c) V(X) (d) V(Y)
5-40. For the bivariate random variable (X,Y), demonstrate that ρXY = 0 when the probability density function is of the form f(x, y) = e^(−x−y) for x > 0 and y > 0; 0 elsewhere.
5-39. Suppose (X,Y) is a bivariate random variable and X and Y are discrete (or continuous). If X and Y are independent random variables, which of the following is a valid equality? (a) E(XY) = E(X) ·
5-38. What is the justification for using equation (5.20) for finding P(a < X ≤ b, c < Y ≤ d)? Can (5.20) simply be replaced by F(b,d) − F(a, c)?
5-37. Suppose a bivariate probability density function has the form f(x, y) = 2 − x − y for 0 < x < 1 and 0 < y < 1; 0 elsewhere. Determine: (a) The marginal probability density functions of the
5-36. Suppose that (X,Y) is a bivariate random variable with probability density function f(x, y) = k(x + y) for 0 ≤ x ≤ 1 and 0 ≤ y ≤ 1; 0 elsewhere. Find: (a) The constant k (b) The bivariate
5-35. Suppose a bivariate random variable (X,Y) has a probability density function of the form f(x, y) = e^(−x−y) for x > 0 and y > 0; 0 elsewhere. Find P(X + Y > 10).
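For 5-35, the joint density e^(−x−y) factors, so X and Y are independent unit exponentials. Conditioning on X gives P(X + Y > 10) = ∫ e^(−x) P(Y > 10 − x) dx with P(Y > t) = e^(−t) for t > 0 and 1 otherwise, which evaluates to 11e^(−10). A stdlib-only numeric cross-check of that reduction:

```python
import math

# P(X + Y > 10) for X, Y i.i.d. unit exponentials: closed form 11*e^(-10).
closed_form = 11 * math.exp(-10)

def integrand(x):
    # e^(-x) * P(Y > 10 - x), with the tail probability capped at 1
    tail_y = math.exp(-(10 - x)) if x < 10 else 1.0
    return math.exp(-x) * tail_y

# crude midpoint rule over [0, 60]; the tail beyond 60 is negligible
n, a, b = 200_000, 0.0, 60.0
h = (b - a) / n
numeric = sum(integrand(a + (i + 0.5) * h) for i in range(n)) * h
assert abs(numeric - closed_form) / closed_form < 1e-3
```

The probability is about 4.99 × 10⁻⁴, small enough that a naive Monte Carlo estimate would need millions of draws; direct quadrature is the cheaper check here.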
5-34. Suppose α ≤ X ≤ β and α ≤ Y ≤ β. What is the form of the uniform probability density function for the bivariate random variable (X,Y)?
5-33. Let X,Y be independent random variables with probability mass functions f (X), g(Y), respectively (or probability density functions f (x), g(y), respectively). Let Z = X + Y with probability
5-32. What is the value of the k that will make f(x, y) = kxy⁻¹ (that is, kx/y) for 0 < x < 1 and 1 < y < 2; 0 elsewhere a valid probability density function?
5-31. For the joint probability density function f(x, y) = x + y for 0 < x < 1 and 0 < y < 1; 0 elsewhere, find ρXY.
5-30. Are the random variables X and Y with probability density function f(x, y) = 12(xy − xy²) for 0 < x < 1 and 0 < y < 1; 0 elsewhere independent?
5-29. Suppose the bivariate cumulative distribution function for the continuous random variables X and Y appears as F(t, s) = 0 for t < 0 or s < 0; 1 − e^(−2t) − e^(−2s) + e^(−2(t+s)) for t ≥ 0 and s ≥ 0
5-28. Suppose that the random variables X and Y have the following joint probability density function: f(x, y) = 2 for 0 < x < y < 1; 0 elsewhere. Find: (a) The marginal distributions of X and Y (b) The
5-27. Suppose f (x, y) is a bivariate probability density function and we want to find P(a < X
5-26. Can the expression f(x, y) = x·e^(−x−xy) for x > 0 and y > 0; 0 elsewhere serve as a joint probability density function?
5-25. Given the bivariate cumulative distribution function F(t, s), the joint probability P(a < X ≤ b, Y ≤ d) = F(b, d) − F(a, d). Using this expression, find P(1/3 < X ≤ 1, Y ≤ 2/3) when f(x,
5-24. If F(t, s) = ts(t + s)/2 for 0 ≤ t ≤ 1 and 0 ≤ s ≤ 1, find the joint probability density function f(t, s).
5-23. Find the value of k that makes f(x, y) = kxy for 0 < x < 2 and 1 < y < 3; 0 elsewhere a bivariate probability density function.
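Reading the garbled density in 5-23 as f(x, y) = k·x·y (an assumption about the original typesetting), the integral separates: ∫₀² x dx · ∫₁³ y dy = 2 · 4 = 8, so k = 1/8. A quick numeric confirmation of the normalizing mass:

```python
# Sketch for 5-23, assuming the density reads f(x, y) = k*x*y
# on 0 < x < 2, 1 < y < 3.  Separability gives mass 2 * 4 = 8, so k = 1/8.
def double_midpoint(f, ax, bx, ay, by, n=400):
    """Midpoint rule on a rectangle -- exact for a bilinear integrand."""
    hx, hy = (bx - ax) / n, (by - ay) / n
    total = 0.0
    for i in range(n):
        x = ax + (i + 0.5) * hx
        for j in range(n):
            y = ay + (j + 0.5) * hy
            total += f(x, y)
    return total * hx * hy

mass = double_midpoint(lambda x, y: x * y, 0, 2, 1, 3)
k = 1 / mass
assert abs(k - 1 / 8) < 1e-9
```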
5-22. Given the joint probability density function f(x, y) = k(x + 2y) for 0 < x ≤ 2 and 0 < y ≤ 1; 0 elsewhere, find: (a) The value of k (b) The marginal distribution of X (c) The cumulative
5-21. Let X,Y be independent random variables with respective probability density functions f(x) = (1/10)e^(−x/10) for x > 0; 0 elsewhere, and g(y) = (1/10)e^(−y/10) for y > 0; 0 elsewhere. If Z = X + Y, find h(z), the
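The coefficient in 5-21 is garbled in the source; 1/10 is the value that makes f(x) = (1/10)e^(−x/10) integrate to 1, so that reading is assumed here. The convolution of two such exponentials is the Gamma(shape 2, scale 10) density h(z) = (z/100)e^(−z/10) for z > 0, which the simulation below cross-checks through the implied CDF:

```python
import math
import random

# Sketch for 5-21 under the assumption f(x) = (1/10)e^(-x/10), x > 0.
def h(z):
    # convolution density of two independent mean-10 exponentials
    return (z / 100.0) * math.exp(-z / 10.0)

# Integrating h gives P(Z <= 20) = 1 - 3e^(-2); compare with simulation.
rng = random.Random(7)
n = 200_000
hits = sum(rng.expovariate(0.1) + rng.expovariate(0.1) <= 20
           for _ in range(n))
p_mc = hits / n
p_exact = 1 - 3 * math.exp(-2)
assert abs(p_mc - p_exact) < 0.01
```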
5-20. Let X1, X2, and X3 be random variables, where: E(X1) = 1, V(X1) = 1, COV(X1,X2) = 2; E(X2) = −1, V(X2) = 5, COV(X1,X3) = −1; E(X3) = 3, V(X3) = 2, COV(X2,X3) = 0.5. For U = X1 − 2X2 + 3X3, find: (a)
5-19. Given the bivariate probability density function specified in the preceding exercise, define event A = {(X,Y) |X + Y ≤ 1} .Find P(A).
5-18. Given the bivariate probability density function for the continuous random variables X and Y, f(x, y) = λ for 0 < x < 1 and 0 < y < 2; 0 elsewhere. Find: (a) λ (b) P(0 < X < 1/2, 1/2 < Y < 3/4)
5-17. For the bivariate probability density function appearing in Exercise 5–13, find:(a) E(X) and E(Y)(b) E(XY)(c) V(X) and V(Y)(d) COV(X,Y)
5-16. Suppose a bivariate random variable (X,Y) has a probability density function of the form f(x, y) = θ²e^(−θ(x+y)) with θ > 0, for x ≥ 0, y ≥ 0; 0 elsewhere. Find P(0 ≤ X ≤ 100, 0 ≤ Y ≤
5-15. Given the bivariate probability density function f(x, y) = 2y for 0 < x ≤ 1 and 0 < y ≤ 1; 0 elsewhere, find the marginal density functions for X and Y. Are X and Y independent random
5-14. For random variables X and Y, find the value of k that makes f(x, y) = kxy² for 0 ≤ x ≤ 1 and 0 ≤ y ≤ 1; 0 elsewhere, a bivariate probability density function. Then find: (a) The cumulative
5-13. Given the bivariate probability density function f(x, y) = 3x(1 − xy) for 0 < x < 1 and 0 < y < 1; 0 elsewhere, find: (a) P(X ≤ 1/2, Y ≤ 1/2) (b) the marginal densities of X and Y (c) P(1/4 <
5-12. Comment on the following statement: Knowledge of the marginal probability mass functions g(X) and h(Y) is generally not equivalent to knowledge of the bivariate probability mass function f
5-11. Suppose a fair pair of six-sided dice is tossed and (X,Y) is a bivariate random variable defined on the sample space S = {Ei, i = 1, . . . , 36}. Here each simple event Ei is a point (X = j, Y =
5-10. Given the following bivariate probability mass function for the discrete random variables X and Y, find E(Z), where Z = X − XY + 2Y.
        Y=0    Y=1    Y=2    Y=3
X=0    0.05   0.30   0.20   0.05
X=1    0.05   0.10   0.20   0.05
5-9. For X,Y discrete random variables with bivariate probability mass function f(X,Y), demonstrate that the marginal probability mass functions have the form: g(X) = Σ_j f(X, Yj); h(Y) = Σ_i f(Xi, Y).
5-8. A jar contains three balls numbered 1,2, and 3, respectively. Two balls are drawn at random with replacement. Let X be the number of the ball on the first draw and let Y be the number on the
5-7. Let the bivariate probability mass function for the discrete random variables X and Y appear as f(X,Y) = 1/9 for X = 1, 2, 3 and Y = 1, 2, 3; 0 elsewhere. Are X and Y independent?
5-6. Let the bivariate probability mass function of the discrete random variables X and Y be given as f(X,Y) = (X + Y)/36 for X = 1, 2, 3 and Y = 1, 2, 3; 0 elsewhere. Find: (a) P(X = 2, Y ≤ 2) (b) F(2,
5-5. Let a random experiment consist of rolling a fair pair of six-sided dice.Let the random variable X = |difference of the faces| and let the random variable Y = sum of the faces. Determine the
5-4. Given the following bivariate probability distribution, find:
        Y=−1   Y=2    Y=3
X=1    1/14   1/14   2/14
X=2    2/14   3/14   1/14
X=3    1/14   0      3/14
(a) E(X) and E(Y) (b) V(X) and V(Y) (c) ρXY (d) E(X|Y = 2) and E(Y|X = 3) (e) V(X|Y =
5-3. Let a random experiment consist of rolling a fair pair of six-sided dice. Let the random variable X (respectively, Y) be defined as the face value of die 1 (respectively, die 2). Determine the
5-2. Let a random experiment consist of tossing a fair coin twice. Define the random variable X to be the number of heads obtained in the two tosses and let the random variable Y be the opposite face
5-1. Given the following bivariate probability distribution, determine the sets of marginal probabilities for the X and Y random variables. Find:
        Y=0    Y=1    Y=2    Y=3
X=0    3/87   3/87   5/87   5/87
X=1    4/87   6/87   8/87   10/87
X=2    7/87
4-67. Consider the problem of obtaining the probability distribution of a random variable Y from information about the probability distribution of a random variable X, where y = g(x) is a functional
4-66. Suppose X is a continuous random variable with mean μ and standard deviation σ. Let Z = (X − μ)/σ. Demonstrate that:(a) E(Z) = 0(b) V(Z) = 1
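The standardization result in 4-66 holds for any distribution, and it can be illustrated on a finite population: dividing deviations from the mean by the population standard deviation always yields mean 0 and variance 1. The data values below are arbitrary:

```python
import statistics

# Exercise 4-66 illustrated on arbitrary data: Z = (X - mu) / sigma
# has mean 0 and variance 1 by construction.
x = [2.0, 3.5, 5.0, 7.5, 11.0]
mu = statistics.fmean(x)
sigma = statistics.pstdev(x)          # population standard deviation
z = [(xi - mu) / sigma for xi in x]

assert abs(statistics.fmean(z)) < 1e-12
assert abs(statistics.pvariance(z) - 1) < 1e-12
```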
4-65. For the probability mass function utilized in Example 4.8.1, determine φX(e^t). What is its interpretation? What restriction is placed on t? Use φX(e^t) to determine the mean of X.
4-64. For each of the following moment-generating functions, find the associated probability density function: (a) mX(t) = (e^(7t) − e^(3t))/(4t), t ≠ 0 (b) mX(t) = (1 − 0.8t)^(−1), t < 1.25 (c) mX(t) = 2
4-63. For a continuous random variable X with probability density function f(x) = 10e^(−10x) for x > 0; 0 elsewhere, find X's moment-generating function. Verify that μ = σ = 1/10. Then find the
4-62. For the discrete random variable X defined in the preceding exercise, find X's probability generating function. Then determine φX(0), φX⁽¹⁾(0), φX⁽²⁾(0) along with φX⁽¹⁾(1) and φX⁽²⁾
4-61. Suppose a fair coin is tossed twice. Let the random variable X depict the number of heads obtained. Determine the probability mass function for X. Also, find the moment-generating function for
4-60. For X a continuous random variable with probability density function f(x) = 1/3 for −1 < x < 2; 0 elsewhere, demonstrate that mX(t) = (e^(2t) − e^(−t))/(3t), t ≠ 0.
4-59. Let X be a discrete random variable with probability mass function f(X) = p(1 − p)^(X−1) for X = 1, 2, . . . ; 0 elsewhere. Determine mX(t).
4-58. Let the continuous random variable X be uniformly distributed with probability density function f(x) = 1/(β − α) for α < x < β; 0 elsewhere. Show that the moment-generating function for X is
4-57. Suppose the continuous random variable X has a probability density function of the exponential variety, or f(x) = (1/λ)e^(−x/λ) for x ≥ 0, λ > 0; 0 elsewhere. Verify that the moment-generating
4-56. Let the continuous random variable X have a probability density function of the form f(x) = 100e^(−100x) for x > 0; 0 elsewhere. Find mX(t). What is the restriction on t? Also determine mX⁽¹⁾(0),
4-55. Suppose X is a continuous random variable with probability density function f(x) = (1/2)x for 0 < x < 2; 0 elsewhere. Find mX(t), μ1, μ2, and σ².
4-54. Let X be a discrete random variable with probability mass function f(X). If a random variable Y = a + bX, b ≠ 0, then the probability mass function of Y is g(y) = f((y − a)/b). Verify this
4-53. For X a continuous random variable, use the transformation employed in the preceding problem to verify that E(cX) = cE(X), c a constant. (Hint:Let f (y) be the probability density function for
4-52. Let X be a continuous random variable with probability density function f(x). If a random variable Y = a + bX, b ≠ 0, then the probability density function of Y is g(y) = (1/|b|) f((y − a)/b).
4-51. Suppose a continuous random variable X has a probability density function of the form f(x) = 1/5 for 20 < x < 25; 0 elsewhere. Find μr. Then determine μ1, μ2, μ3, and μ4. Also find μ2, μ3,
4-50. If a random variable X has the probability density function f(x) = 4x³ for 0 < x < 1; 0 elsewhere, find its median. Also determine γ0.3.
4-49. Given the probability density function f(x) = (1/2)x²e^(−x) for 0 < x < +∞; 0 elsewhere, for the random variable X, find the mode of X.
4-48. Given that the random variable X has the probability density function f(x) = 1/5 for 5 < x < 10; 0 elsewhere, find: (a) F(t) (b) E(X) (c) V(X) (d) the median (e) the interquartile range
4-47. For the probability density function f(x) = (1/10)e^(−x/10) for x > 0; 0 elsewhere, find: (a) E(X) (b) V(X) (c) μ′3 (d) μ3 (e) α3 (f) α4 (g) γ0.50 (h) γ0.75
4-46. For the discrete probability distribution appearing in Exercise 4-25, find: (a) μ′3 (b) μ3 (c) α3 (d) α4 (e) γ0.5 (f) γ0.3
4-45. Let the probability mass function for a discrete random variable X be f(X) = 0.3 for X = 0; 0.7 for X = 1; 0 elsewhere. Find μr, r = 1, 2, . . . .
4-44. Suppose that the probability density function for a continuous random variable X is f(x) = (1/2)x^(−1/2) for 0 < x < 1; 0 elsewhere. Does X have a symmetrical probability density function? (Hint:
4-43. Given that a continuous random variable X has a cumulative distribution function of the form F(t) = 0 for t < 0; t^(0.5) for 0 ≤ t ≤ 1; 1 for t > 1, find the quartiles of X. That is, find: (a)
4-42. If X is a random variable with E(X) = 3 and V(X) = 4, use Chebyshev’s Theorem to determine a lower-bound for P(−2 < X < 8).
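Exercise 4-42 can be worked directly: the interval (−2, 8) is symmetric about the mean 3 with half-width 5, i.e. k = 5/σ = 5/2 standard deviations, so Chebyshev gives P(|X − 3| < 5) ≥ 1 − 1/k² = 0.84:

```python
# Exercise 4-42 worked: E(X) = 3, V(X) = 4, so sigma = 2.
# (-2, 8) is |X - 3| < 5, i.e. k = 5/2 standard deviations from the mean.
mu, var = 3, 4
half_width = 5                      # distance from mu to either endpoint
k = half_width / var ** 0.5         # 2.5
lower_bound = 1 - 1 / k ** 2        # Chebyshev: P(|X - mu| < k*sigma) >= this
assert abs(lower_bound - 0.84) < 1e-12
```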
4-41. Over the long run the daily price/barrel (denoted X) of a certain grade of crude oil has averaged about $17.23 with a standard deviation of $1.26.Given that the probability distribution of X is
4-40. Let X be a continuous random variable whose mean E(X) = μ and variance V(X) = σ² exist. For any ε > 0 and small, verify that the (4.19) version of Chebyshev's Theorem holds. (Hint: start with
4-39. Use the preceding result to derive the (4.19.2) form of Chebyshev’s Theorem.
4-38. Let X be a random variable such that P(X < 0) = 0. If E(X) = α ≥ 0 exists, then, for t ≥ 1, P(X
4-37. Verify that if X is a discrete or continuous random variable whose variance exists, then, for a and b constants: (a) V(a) = 0 (b) V(a + X) = V(X) (c) V(a + bX) = b²V(X) (d) V(X) = E(X²) − [E(X)]²
4-36. Verify that if X is a discrete or continuous random variable whose expectation exists, then, for a and b constants:(a) E(a) = a(b) E(a ± bX) = a ± bE(X)For finite k, demonstrate that:(c) E
4-35. The probability density function for a continuous random variable X is f(x) = 1/5 for 5 < x < 10; 0 elsewhere. Find E(X) and E(X²). What is V(X)?
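For the uniform density in 4-35, E(X) = 7.5, E(X²) = 175/3, and V(X) = E(X²) − [E(X)]² = 25/12. A stdlib-only quadrature sketch confirming those moments:

```python
# Exercise 4-35 checked numerically: X uniform on (5, 10), f(x) = 1/5.
def moment(power, a=5.0, b=10.0, n=100_000):
    """Midpoint-rule estimate of E(X^power) for the uniform density 1/(b-a)."""
    h = (b - a) / n
    return sum(((a + (i + 0.5) * h) ** power) * (1 / (b - a)) * h
               for i in range(n))

ex, ex2 = moment(1), moment(2)
var = ex2 - ex ** 2
assert abs(ex - 7.5) < 1e-6
assert abs(var - 25 / 12) < 1e-4    # 25/12 ~ 2.0833
```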
4-34. Given the probability mass function
X      1     3     9
f(X)  1/4   1/2   1/4
let g(X) = X² − 1. Find E[g(X)].
4-33. Suppose that the lifetime (in hours) of a particular piece of electrical equipment can be characterized by the probability density function f(x) = (1/200)e^(−x/200) for x > 0; 0 elsewhere. Verify
4-32. The cumulative distribution function for a continuous random variable X is F(t) = 1 − e^(−t/θ), θ > 0, for t ≥ 0; 0 for t < 0. Demonstrate that θ is the mean of X.
4-31. Evaluate the expression E[(aX + b)^n] = Σ_{i=0}^{n} (n choose i) a^(n−i) b^i E(X^(n−i)) for n = 1, 2.
4-30. Comment on the following statement: For X a continuous random variable with probability density function f(x), the existence of E(X) implies that ∫_{−∞}^{0} x f(x)dx, ∫_{0}^{+∞} x f(x)dx,
4-29. Suppose f(x) = x^(−2) for 1 ≤ x < +∞; 0 elsewhere. Does E(X) exist?
4-28. For the probability density function f(x) = λ^(−1)e^(−x/λ), λ > 0, 0 ≤ x < ∞, demonstrate that E(X) = λ and V(X) = λ².
4-27. A fair coin is tossed three times in succession (or three fair coins are tossed once). List the points or simple events in the sample space S. Determine the associated probability distribution.
4-26. Given the probability density function f(x) = 2(1 − x) for 0 ≤ x ≤ 1; 0 elsewhere, find: (a) E(X) (b) V(X)
4-25. Given the following discrete probability distribution, find: (a) E(X) (b) V(X)
X      1     5     7     9
f(X)  1/6   2/6   2/6   1/6
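The distribution in 4-25 is small enough to work with exact rational arithmetic: E(X) = 34/6 = 17/3 and V(X) = E(X²) − [E(X)]² = 115/3 − 289/9 = 56/9:

```python
from fractions import Fraction as F

# Exercise 4-25 worked exactly with the pmf from the table above.
dist = {1: F(1, 6), 5: F(2, 6), 7: F(2, 6), 9: F(1, 6)}
ex  = sum(x * p for x, p in dist.items())        # E(X)   = 17/3
ex2 = sum(x * x * p for x, p in dist.items())    # E(X^2) = 115/3
var = ex2 - ex ** 2                               # V(X)   = 56/9
assert (ex, var) == (F(17, 3), F(56, 9))
```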
4-24. Suppose a continuous random variable has a probability density function of the form f(x) = e^(−x) for x > 0; 0 elsewhere. Find P(X² < θ), θ > 0.
4-23. For the probability density function f(x) = (1/200)e^(−x/200) for x > 0; 0 elsewhere, find the associated cumulative distribution function. Use the latter to determine: (a) P(X > 80) (b) P(50 < X ≤
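For 4-23, integrating the density gives the exponential CDF F(t) = 1 − e^(−t/200) for t ≥ 0, and tail probabilities follow directly, e.g. P(X > 80) = e^(−0.4) ≈ 0.6703:

```python
import math

# Exercise 4-23 sketch: CDF of f(x) = (1/200)e^(-x/200), x > 0.
def F(t):
    return 1 - math.exp(-t / 200) if t >= 0 else 0.0

p_gt_80 = 1 - F(80)                      # tail probability e^(-0.4)
assert abs(p_gt_80 - math.exp(-0.4)) < 1e-12
```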
4-22. Suppose a probability density function has the form f(x) = (40 + x)/1600 for −40 < x < 0; (40 − x)/1600 for 0 ≤ x < 40; 0 elsewhere. For A = {−10 ≤ X ≤ 10}, find P(A). (Hint:
4-21. Do you agree or disagree with statements (a)–(f)? Given that the indicated limits exist: (a) ∫_{a}^{+∞} f(x)dx = lim_{b→+∞} ∫_{a}^{b} f(x)dx (b) ∫_{−∞}^{b} f(x)dx = lim_{a→+∞} ∫_{−a}^{b} f(x)dx (c)
4-20. Verify that the cumulative distribution function given by (4.9):(a) has values restricted to [0,1](b) is monotone nondecreasing and continuous in t, with its derivatives existing at every point
4-19. Verify that the cumulative distribution function given by (4.3):(a) has values restricted to [0, 1](b) is monotone nondecreasing(c) is continuous from the right(d) defines P(Xs < X ≤ Xt) =
Showing 300 - 400 of 5580