Questions and Answers of Theory Of Probability
82. Let X1, X2, . . . be independent continuous random variables with a common distribution function F and density f = F′, and for k ≥ 1 let Nk = min{n ≥ k: Xn = kth largest of X1, . . . , Xn}. Hint: Use
83. An urn contains n balls, with ball i having weight wi , i = 1, . . . , n. The balls are withdrawn from the urn one at a time according to the following scheme: When S is the set of balls that
84. In the list example of Section 3.6.1 suppose that the initial ordering at time t = 0 is determined completely at random; that is, initially all n! permutations are equally likely. Following the
85. In the list problem, when the Pi are known, show that the best ordering (best in the sense of minimizing the expected position of the element requested) is to place the elements in decreasing
86. Consider the random graph of Section 3.6.2 when n = 5. Compute the probability distribution of the number of components and verify your solution by using it to compute E[C] and then comparing
87. (a) From the results of Section 3.6.3 we can conclude that there are (n+m−1 choose m−1) nonnegative integer valued solutions of the equation x1 + · · · + xm = n. Prove this directly. (b) How many
88. In Section 3.6.3, we saw that if U is a random variable that is uniform on (0, 1) and if, conditional on U = p, X is binomial with parameters n and p, then P{X = i} = 1/(n + 1), i = 0, 1, . . . , n
89. Let I1, . . . , In be independent random variables, each of which is equally likely to be either 0 or 1. A well-known nonparametric statistical test (called the signed rank test) is concerned
90. The number of accidents in each period is a Poisson random variable with mean 5. With Xn, n ≥ 1, equal to the number of accidents in period n, find E[N] when (a) N = min(n: Xn−2 = 2, Xn−1 = 1, Xn
91. Find the expected number of flips of a coin, which comes up heads with probability p, that are necessary to obtain the pattern h, t, h, h, t, h, t, h.
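A Monte Carlo sketch of this exercise (illustrative code, not part of the text; the function name and trial count are my own choices). For p = 1/2 the pattern-overlap (renewal) argument gives E = 2^8 + 2^3 + 2^1 = 266, since h,t,h,h,t,h,t,h overlaps itself at lengths 8, 3, and 1; the simulation should land near that value.

```python
import random

def flips_until_pattern(pattern, p, rng):
    """Flip a coin with P(heads) = p until `pattern` appears; return the flip count."""
    k = len(pattern)
    window = []
    flips = 0
    while window != pattern:
        flips += 1
        window.append('h' if rng.random() < p else 't')
        if len(window) > k:
            window.pop(0)   # keep only the last k flips
    return flips

rng = random.Random(42)
pattern = list('hthhthth')
trials = 20_000
mean = sum(flips_until_pattern(pattern, 0.5, rng) for _ in range(trials)) / trials
# Overlap argument for p = 1/2: 2**8 + 2**3 + 2**1 = 266.
```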
92. The number of coins that Josh spots when walking to work is a Poisson random variable with mean 6. Each coin is equally likely to be a penny, a nickel, a dime, or a quarter. Josh ignores the
93. Consider a sequence of independent trials, each of which is equally likely to result in any of the outcomes 0, 1, . . . ,m. Say that a round begins with the first trial, and that a new round
94. Let N be a hypergeometric random variable having the distribution of the number of white balls in a random sample of size r from a set of w white and b blue balls.That is,where we use the
95. For the left skip free random walk of Section 3.6.6 let β = P(Sn ≤ 0 for all n) be the probability that the walk is never positive. Find β when E[Xi] < 0.
96. Consider a large population of families, and suppose that the number of children in the different families are independent Poisson random variables with mean λ. Show that the number of siblings
80. A coin that comes up heads with probability p is flipped n consecutive times. What is the probability that starting with the first flip there are always more heads than tails that have appeared?
79. An urn contains n white and m black balls that are removed one at a time. If n > m, show that the probability that there are always more white than black balls in the urn (until, of course, the
63. Suppose there are n types of coupons, and that the type of each new coupon obtained is independent of past selections and is equally likely to be any of the n types.Suppose one continues
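The statement above is cut off by the page, but it begins the classic coupon-collector setup. As a hedged sanity check (illustrative code; n, the seed, and the trial count are my own choices), the expected number of draws to obtain all n equally likely types is n(1 + 1/2 + · · · + 1/n):

```python
import random

def draws_to_collect_all(n, rng):
    """Draw uniformly among n coupon types until every type has appeared."""
    seen = set()
    draws = 0
    while len(seen) < n:
        seen.add(rng.randrange(n))
        draws += 1
    return draws

rng = random.Random(5)
n, trials = 10, 100_000
mean = sum(draws_to_collect_all(n, rng) for _ in range(trials)) / trials
# Theory: E = n * (1 + 1/2 + ... + 1/n), about 29.29 for n = 10.
```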
64. A and B roll a pair of dice in turn, with A rolling first. A’s objective is to obtain a sum of 6, and B’s is to obtain a sum of 7. The game ends when either player reaches his or her
65. The number of red balls in an urn that contains n balls is a random variable that is equally likely to be any of the values 0, 1, . . . , n. That is,The n balls are then randomly removed one at a
66. The opponents of soccer team A are of two types: either they are a class 1 or a class 2 team. The number of goals team A scores against a class i opponent is a Poisson random variable with mean
67. A coin having probability p of coming up heads is continually flipped. Let Pj(n) denote the probability that a run of j successive heads occurs within the first n flips. (a) Argue that Pj(n) = Pj(n
68. In a knockout tennis tournament of 2^n contestants, the players are paired and play a match. The losers depart, the remaining 2^(n−1) players are paired, and they play a match. This continues for n
69. In the match problem, say that (i, j), i < j, is a pair if i chooses j’s hat and j chooses i’s hat.(a) Find the expected number of pairs.(b) Let Qn denote the probability that there are no
70. Let N denote the number of cycles that result in the match problem.(a) Let Mn = E[N], and derive an equation for Mn in terms of M1, . . . ,Mn−1.(b) Let Cj denote the size of the cycle that
71. Use Equation (3.14) to obtain Equation (3.10).Hint: First multiply both sides of Equation (3.14) by n, then write a new equation by replacing n with n − 1, and then subtract the former from the
72. In Example 3.28 show that the conditional distribution of N given that U1 = y is the same as the conditional distribution of M given that U1 = 1 − y. Also, show that E[N|U1 = y] = E[M|U1 = 1
73. Suppose that we continually roll a die until the sum of all throws exceeds 100. What is the most likely value of this total when you stop?
74. There are five components. The components act independently, with component i working with probability pi , i = 1, 2, 3, 4, 5. These components form a system as shown in Figure 3.7.The system is
75. This problem will present another proof of the ballot problem of Example 3.27.(a) Argue that Pn,m = 1 − P{A and B are tied at some point}(b) Explain why P{A receives first vote and they are
76. Consider a gambler who on each bet either wins 1 with probability 18/38 or loses 1 with probability 20/38. (These are the probabilities if the bet is that a roulette wheel will land on a
77. Show that (a) E[XY|Y = y] = yE[X|Y = y]; (b) E[g(X, Y)|Y = y] = E[g(X, y)|Y = y]; (c) E[XY] = E[Y E[X|Y]]
In the ballot problem (Example 3.27), compute P{A is never behind}.
97. Use the conditional variance formula to find the variance of a geometric random variable.
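Conditioning on the first trial, the conditional variance formula yields Var(N) = (1 − p)/p² for a geometric N. A minimal simulation sketch checking that value (illustrative names and parameters; not part of the exercise set):

```python
import random

def geometric(p, rng):
    """Number of Bernoulli(p) trials up to and including the first success."""
    n = 1
    while rng.random() >= p:
        n += 1
    return n

rng = random.Random(0)
p, trials = 0.3, 200_000
xs = [geometric(p, rng) for _ in range(trials)]
mean = sum(xs) / trials
var = sum((x - mean) ** 2 for x in xs) / trials
# Conditional variance formula: Var(N) = (1 - p) / p**2, about 7.78 for p = 0.3.
```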
24. A coin, having probability p of landing heads, is continually flipped until at least one head and one tail have been flipped.(a) Find the expected number of flips needed.(b) Find the expected
60. Calculate the moment generating function of the uniform distribution on (0, 1).Obtain E[X] and Var[X] by differentiating.
79. With K(t) = log(E[e^{tX}]), show that K′(0) = E[X] and K″(0) = Var(X)
80. Let X denote the number of the events A1, . . . , An that occur. Express E[X], Var(X), and E[(X choose k)] in terms of the quantities Sk = Σ_{i1 < · · · < ik} P(Ai1 · · · Aik)
Suppose that p(x, y), the joint probability mass function of X and Y, is given by p(1, 1) = 0.5, p(1, 2) = 0.1, p(2, 1) = 0.1, p(2, 2) = 0.3 Calculate the conditional probability mass function of X
If X1 and X2 are independent binomial random variables with respective parameters (n1, p) and (n2, p), calculate the conditional probability mass function of X1 given that X1 + X2 = m.
If X and Y are independent Poisson random variables with respective means λ1 and λ2, calculate the conditional expected value of X given that X + Y = n.
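The known answer is that X given X + Y = n is Binomial(n, λ1/(λ1 + λ2)), so E[X|X + Y = n] = nλ1/(λ1 + λ2). A conditioning-by-rejection simulation sketch (illustrative parameters λ1 = 2, λ2 = 3, n = 5; the Poisson sampler is Knuth's product method):

```python
import math
import random

def poisson(lam, rng):
    """Knuth's product method for sampling Poisson(lam)."""
    limit, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= rng.random()
        if prod < limit:
            return k
        k += 1

rng = random.Random(11)
lam1, lam2, n = 2.0, 3.0, 5
cond_x = []
for _ in range(200_000):
    x, y = poisson(lam1, rng), poisson(lam2, rng)
    if x + y == n:           # keep only samples with X + Y = n
        cond_x.append(x)
mean = sum(cond_x) / len(cond_x)
# Theory: X | X + Y = 5 is Binomial(5, 2/5), so the conditional mean is 2.
```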
Consider an experiment that results in one of three possible outcomes with outcome i occurring with probability pi, i = 1, 2, 3, Σ_{i=1}^{3} pi = 1. Suppose that n independent replications of this experiment
There are n components. On a rainy day, component i will function with probability pi; on a nonrainy day, component i will function with probability qi, for i = 1, . . . , n. It will rain tomorrow
Suppose the joint density of X and Y is given by f(x, y) = 6xy(2 − x − y), 0 < x < 1, 0 < y < 1. Compute the conditional expectation of X given that Y = y.
Suppose the joint density of X and Y is given by f(x, y) = 4y(x − y)e^{−(x+y)}, 0 < x < ∞, 0 ≤ y ≤ x, and f(x, y) = 0 otherwise. Compute E[X|Y = y].
The joint density of X and Y is given by f(x, y) = (1/2) y e^{−xy}, 0 < x < ∞, 0 < y < 2, and f(x, y) = 0 otherwise. What is E[e^{X/2}|Y = 1]?
Let X1 and X2 be independent exponential random variables with rates μ1 and μ2. Find the conditional density of X1 given that X1 + X2 = t.
Sam will read either one chapter of his probability book or one chapter of his history book. If the number of misprints in a chapter of his probability book is Poisson distributed with mean 2 and if
Example 3.11 (The Expectation of the Sum of a Random Number of Random Variables) Suppose that the expected number of accidents per week at an industrial plant is four. Suppose also that the numbers
A coin, having probability p of coming up heads, is to be successively flipped until the first head appears.What is the expected number of flips required?
A miner is trapped in a mine containing three doors. The first door leads to a tunnel that takes him to safety after two hours of travel. The second door leads to a tunnel that returns him to the
Suppose in Example 2.31 that those choosing their own hats depart, while the others (those without a match)put their selected hats in the center of the room, mix them up, and then reselect.Also,
78. Let φ(t1, . . . , tn) denote the joint moment generating function of X1, . . . , Xn. (a) Explain how the moment generating function of Xi, φXi(ti), can be obtained from φ(t1, . . . , tn). (b)
77. Let X and Y be independent normal random variables, each having parameters μ and σ². Show that X + Y is independent of X − Y. Hint: Find their joint moment generating function.
61. Let X and W be the working and subsequent repair times of a certain machine. Let Y = X + W and suppose that the joint probability density of X and Y is fX,Y(x, y) = λ²e^{−λy}, 0 < x < y < ∞. (a)
62. In deciding upon the appropriate premium to charge, insurance companies sometimes use the exponential principle, defined as follows. With X as the random amount that it will have to pay in
63. Calculate the moment generating function of a geometric random variable.
64. Show that the sum of independent identically distributed exponential random variables has a gamma distribution.
65. Consider Example 2.48. Find Cov(Xi , Xj) in terms of the ars.
66. Use Chebyshev’s inequality to prove the weak law of large numbers. Namely, if X1, X2, . . . are independent and identically distributed with mean μ and variance σ², then, for any ε > 0, P{|(X1 + · · · + Xn)/n − μ| ≥ ε} → 0 as n → ∞.
67. Suppose that X is a random variable with mean 10 and variance 15. What can we say about P{5 < X < 15}?
68. Let X1, X2, . . . ,X10 be independent Poisson random variables with mean 1.(a) Use the Markov inequality to get a bound on P{X1 + · · · + X10 ≥ 15}.(b) Use the central limit theorem to
69. If X is normally distributed with mean 1 and variance 4, use the tables to find P{2 < X < 3}.
70. Show that lim_{n→∞} e^{−n} Σ_{k=0}^{n} n^k/k! = 1/2. Hint: Let Xn be Poisson with mean n. Use the central limit theorem to show that P{Xn ≤ n} → 1/2.
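The quantity in this exercise is P{Xn ≤ n} for Xn Poisson with mean n, and the hinted CLT argument says it tends to 1/2 from above. A deterministic numeric check (illustrative function name; the running term is scaled by e^{−n} to avoid overflow):

```python
import math

def poisson_cdf_at_mean(n):
    """Compute e^{-n} * sum_{k=0}^{n} n^k / k! using a scaled running term."""
    term = math.exp(-n)   # e^{-n} * n^0 / 0!
    total = term
    for k in range(1, n + 1):
        term *= n / k     # move from n^{k-1}/(k-1)! to n^k/k!, still scaled by e^{-n}
        total += term
    return total

vals = [poisson_cdf_at_mean(n) for n in (25, 100, 400)]
# The values decrease toward 1/2 as n grows.
```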
71. Let X denote the number of white balls selected when k balls are chosen at random from an urn containing n white and m black balls.(a) Compute P{X = i}.(b) Let, for i = 1, 2, . . . , k; j = 1, 2,
72. Show that Var(X) = 1 when X is the number of men who select their own hats in Example 2.31.
73. For the multinomial distribution (Exercise 17), let Ni denote the number of times outcome i occurs. Find(a) E[Ni];(b) Var(Ni);(c) Cov(Ni , Nj);(d) Compute the expected number of outcomes that do
74. Let X1, X2, . . . be a sequence of independent identically distributed continuous random variables. We say that a record occurs at time n if Xn > max(X1, . . . , Xn−1). That is, Xn is a record if it
75. Let a1 < a2 < · · · < an denote a set of n numbers, and consider any permutation of these numbers. We say that there is an inversion of ai and aj in the permutation if i < j and aj precedes ai
76. Let X and Y be independent random variables with means μx and μy and variances σx² and σy². Show that Var(XY) = σx²σy² + μy²σx² + μx²σy².
Independent trials, each of which is a success with probability p, are performed until there are k consecutive successes. What is the mean number of necessary trials?
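A simulation sketch of this waiting time (illustrative code and parameter choices). Conditioning on the first failure gives the standard answer E = (p^{−k} − 1)/(1 − p), which is 14 for p = 1/2 and k = 3:

```python
import random

def trials_until_run(p, k, rng):
    """Trials needed to see k consecutive successes, each trial succeeding w.p. p."""
    n = run = 0
    while run < k:
        n += 1
        run = run + 1 if rng.random() < p else 0   # a failure resets the run
    return n

rng = random.Random(2)
p, k, trials = 0.5, 3, 200_000
mean = sum(trials_until_run(p, k, rng) for _ in range(trials)) / trials
# Theory: E = (p**-k - 1) / (1 - p) = 14 for p = 1/2, k = 3.
```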
In the match problem of Example 2.31 involving n, n > 1, individuals, find the conditional expected number of matches given that the first person did not have a match.
Independent trials, each resulting in a success with probability p, are performed in sequence. Let N be the trial number of the first success. Find Var(N).
7. Suppose p(x, y, z), the joint probability mass function of the random variables X, Y, and Z, is given byWhat is E[X|Y = 2]? What is E[X|Y = 2,Z = 1]? p(1, 1, 1), p(1,1,2), p(2,1,1) = 1, p(2,1,2) =
8. An unbiased die is successively rolled. Let X and Y denote, respectively, the number of rolls necessary to obtain a six and a five. Find (a) E[X],(b) E[X|Y = 1], (c) E[X|Y = 5].
9. Show in the discrete case that if X and Y are independent, then E[X|Y = y] = E[X] for all y
10. Suppose X and Y are independent continuous random variables. Show that E[X|Y = y] = E[X] for all y
11. The joint density of X and Y is f(x, y) = (y² − x²)e^{−y}/8, 0 < y < ∞, −y ≤ x ≤ y. Show that E[X|Y = y] = 0.
12. The joint density of X and Y is given by f(x, y) = e^{−x/y} e^{−y}/y, 0 < x < ∞, 0 < y < ∞. Show E[X|Y = y] = y.
13. Let X be exponential with mean 1/λ; that is, fX(x) = λe^{−λx}, 0 < x < ∞. Find E[X|X > 1].
14. Let X be uniform over (0, 1). Find E[X|X < 1/2].
15. The joint density of X and Y is given by f(x, y) = e^{−y}/y, 0 < x < y, 0 < y < ∞. Compute E[X²|Y = y].
16. The random variables X and Y are said to have a bivariate normal distribution if their joint density function is given by (a) Show that X is normally distributed with mean μx and variance σx²,
17. Let Y be a gamma random variable with parameters (s, α). That is, its density iswhere C is a constant that does not depend on y. Suppose also that the conditional distribution of X given that Y
18. Let X1, . . . ,Xn be independent random variables having a common distribution function that is specified up to an unknown parameter θ. Let T = T(X) be a function of the data X = (X1, . . .
19. Prove that if X and Y are jointly continuous, then E[X] = ∫_{−∞}^{∞} E[X|Y = y] f_Y(y) dy.
20. An individual whose level of exposure to a certain pathogen is x will contract the disease caused by this pathogen with probability P(x). If the exposure level of a randomly chosen member of the
21. Consider Example 3.13, which refers to a miner trapped in a mine. Let N denote the total number of doors selected before the miner reaches safety. Also, let Ti denote the travel time
22. Suppose that independent trials, each of which is equally likely to have any of m possible outcomes, are performed until the same outcome occurs k consecutive times. If N denotes the number of
6. Repeat Exercise 5 but under the assumption that when a ball is selected its color is noted, and it is then replaced in the urn before the next selection is made.
5. An urn contains three white, six red, and five black balls. Six of these balls are randomly selected from the urn. Let X and Y denote respectively the number of white and black balls selected.
Let X1, X2, . . . be independent and identically distributed random variables with distribution F having mean μ and variance σ2, and assume that they are independent of the nonnegative integer
Suppose that X and Y are independent continuous random variables having densities fX and fY, respectively. Compute P{X < Y}.
An insurance company supposes that the number of accidents that each of its policyholders will have in a year is Poisson distributed, with the mean of the Poisson depending on the policyholder. If
Suppose that the number of people who visit a yoga studio each day is a Poisson random variable with mean λ. Suppose further that each person who visits is, independently, female with probability p
Suppose that we are to be presented with n distinct prizes in sequence. After being presented with a prize we must immediately decide whether to accept it or reject it and consider the next prize.The
At a party n men take off their hats. The hats are then mixed up and each man randomly selects one. We say that a match occurs if a man selects his own hat. What is the probability of no matches?
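By inclusion–exclusion, P(no match) = Σ_{i=0}^{n} (−1)^i/i!, which converges rapidly to 1/e ≈ 0.3679. A shuffle-based simulation sketch of the match problem (illustrative code; n and trial count are my own choices):

```python
import random

rng = random.Random(7)
n, trials = 10, 100_000
no_match = 0
for _ in range(trials):
    hats = list(range(n))
    rng.shuffle(hats)                               # hats[i] = hat picked by man i
    if all(hats[i] != i for i in range(n)):         # a derangement: nobody matched
        no_match += 1
prop = no_match / trials
# Theory: P(no match) = sum_{i=0}^{n} (-1)**i / i!  ->  1/e ~ 0.3679.
```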
In an election, candidate A receives n votes, and candidate B receives m votes where n > m. Assuming that all orderings are equally likely, show that the probability that A is always ahead in the
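The ballot problem's answer is (n − m)/(n + m). A simulation sketch checking the strictly-ahead event over random orderings (illustrative code; n = 6, m = 4 are my own choices, giving probability 0.2):

```python
import random

rng = random.Random(1)
n, m, trials = 6, 4, 100_000
always_ahead = 0
for _ in range(trials):
    votes = [1] * n + [-1] * m    # +1 for A, -1 for B
    rng.shuffle(votes)
    lead, ok = 0, True
    for v in votes:
        lead += v
        if lead <= 0:             # A must lead strictly after every vote
            ok = False
            break
    always_ahead += ok
prop = always_ahead / trials
# Theory: P = (n - m) / (n + m) = 0.2 for n = 6, m = 4.
```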
Let U1, U2, . . . be a sequence of independent uniform (0, 1) random variables, and let N = min{n ≥ 2: Un > Un−1} and M = min{n ≥ 1: U1 + · · · + Un > 1}. That is, N is the index of the first
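Both of these indices famously have expectation e: P(N > n) = 1/n! (the first n uniforms must be decreasing) and P(M > n) = 1/n! (their sum stays below 1 on a simplex of volume 1/n!). A simulation sketch of both (illustrative function names and trial count):

```python
import random

def first_increase_index(rng):
    """N = min{n >= 2 : U_n > U_{n-1}}."""
    prev, n = rng.random(), 1
    while True:
        n += 1
        u = rng.random()
        if u > prev:
            return n
        prev = u

def first_sum_exceeds_one(rng):
    """M = min{n >= 1 : U_1 + ... + U_n > 1}."""
    s, n = 0.0, 0
    while s <= 1.0:
        s += rng.random()
        n += 1
    return n

rng = random.Random(3)
trials = 200_000
mean_N = sum(first_increase_index(rng) for _ in range(trials)) / trials
mean_M = sum(first_sum_exceeds_one(rng) for _ in range(trials)) / trials
# Theory: E[N] = E[M] = e ~ 2.71828.
```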
Let X1,X2, . . . be independent continuous random variables with a common distribution function F and density f = F, and suppose that they are to be observed one at a time in sequence. Let N = min{n
Showing 2700 - 2800 of 6259