Questions and Answers of Elementary Probability for Applications
15. Let (Zn : 1 ≤ n < ∞) be a sequence of independent, identically distributed random variables with P(Zn = 0) = q, P(Zn = 1) = p, where p + q = 1. Let Ai be the event that Zi = 0 and Zi−1 = 1. If Un denotes the number of indices i with 2 ≤ i ≤ n for which Ai occurs, prove that E(Un) = (n − 1)pq, and find the variance of Un. (Oxford 1977F)
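One standard route (a sketch, not necessarily the intended solution): write $U_n = \sum_{i=2}^{n} 1_{A_i}$. Since $P(A_i) = P(Z_{i-1} = 1)P(Z_i = 0) = pq$ by independence,
$$E(U_n) = \sum_{i=2}^{n} P(A_i) = (n-1)pq.$$
For the variance, $A_i$ and $A_{i+1}$ are incompatible ($A_i$ requires $Z_i = 0$, while $A_{i+1}$ requires $Z_i = 1$), whereas $A_i$ and $A_j$ are independent when $|i-j| \ge 2$; there are $\tfrac{1}{2}(n-2)(n-3)$ such unordered pairs, so
$$E(U_n^2) = (n-1)pq + (n-2)(n-3)p^2q^2, \qquad \mathrm{var}(U_n) = (n-1)pq - (3n-5)p^2q^2.$$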
14. Each time you flip a certain coin, heads appears with probability p. Suppose that you flip the coin a random number N of times, where N has the Poisson distribution with parameter λ and is …
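The snippet is cut off mid-sentence; assuming the usual completion of this classic exercise (N is independent of the outcomes of the flips, and the question asks for the distribution of the number of heads, here written H), Poisson thinning gives
$$P(H = k) = \sum_{n \ge k} e^{-\lambda}\frac{\lambda^n}{n!}\binom{n}{k}p^k(1-p)^{n-k} = e^{-\lambda}\frac{(\lambda p)^k}{k!}\sum_{m \ge 0}\frac{(\lambda(1-p))^m}{m!} = e^{-\lambda p}\frac{(\lambda p)^k}{k!},$$
so the number of heads is itself Poisson, with parameter λp.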
13. In Problem 3.6.12 above, find the expected number of different types of coupon in the first n coupons received.
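A sketch by indicators, assuming the setup of the coupon-collecting problem below (each coupon independently equally likely to be any of the c types): if D denotes the number of distinct types among the first n coupons, then
$$E(D) = \sum_{j=1}^{c} P(\text{type } j \text{ appears}) = c\Bigl(1 - \bigl(1 - \tfrac{1}{c}\bigr)^{n}\Bigr).$$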
12. Coupon-collecting problem. There are c different types of coupon, and each coupon obtained is equally likely to be any one of the c types. Let Yi be the additional number of coupons collected, …
11. We are provided with a coin which comes up heads with probability p at each toss. Let v1, v2, . . . , vn be n distinct points on … Show that
(a) the expected number of edges in the random graph is (1/2)n(n − 1)p,
(b) the expected number of triangles (triples of points each pair of which is joined by an edge) is (1/6)n(n − 1)(n − 2)p³.
10. A number N of balls are thrown at random into M boxes, with multiple occupancy permitted. Show that the expected number of empty boxes is (M − 1)^N / M^{N−1}.
9. The random variables U and V each take the values ±1. Their joint distribution is given by P(U = +1) = P(U = −1) = 1/2, P(V = +1 | U = 1) = 1/3 = P(V = −1 | U = −1), P(V = −1 | U = 1) = 2/3 = P(V = +1 | U = −1).
(a) Find the probability that x² + Ux + V = 0 has at least one real root.
(b) Find the expected value of the larger root, given that there is at least one real root.
(c) Find the probability that x² + (U + V)x + U + V = 0 has at least one real root. (Oxford 1980M)
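All three expectation claims above yield to linearity over indicators (a sketch; for the graph parts it is assumed, as (a) and (b) presuppose, that each of the $\binom{n}{2}$ pairs of points is joined independently with probability p):
$$E(\text{edges}) = \binom{n}{2}p = \tfrac{1}{2}n(n-1)p, \qquad E(\text{triangles}) = \binom{n}{3}p^3 = \tfrac{1}{6}n(n-1)(n-2)p^3,$$
and, for Problem 10, box j is empty with probability $(1 - 1/M)^N$, so the expected number of empty boxes is $M(1 - 1/M)^N = (M-1)^N/M^{N-1}$.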
8. Let X1, X2, . . . be independent, identically distributed random variables, and Sn = X1 + X2 + · · · + Xn. Show that E(Sm/Sn) = m/n if m ≤ n, and E(Sm/Sn) = 1 + (m − n)μ E(1/Sn) if m > n, where …
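A sketch of the symmetry argument (assuming, as the truncated ‘where …’ presumably states, that μ = E(X1) and that the Xi are positive so the ratios are well defined): for i ≤ n the quantities E(Xi/Sn) are equal by exchangeability and sum to E(Sn/Sn) = 1, so each equals 1/n and E(Sm/Sn) = m/n for m ≤ n. For m > n, write Sm = Sn + X_{n+1} + · · · + X_m; each added term is independent of Sn, so E(Sm/Sn) = 1 + (m − n)μ E(1/Sn).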
7. Let X1, X2, . . . be discrete random variables, each having mean μ, and let N be a random variable which takes values in the non-negative integers and which is independent of the Xi. By …
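Assuming the cut-off text continues to the standard conclusion E(X1 + · · · + XN) = μE(N), a partition-theorem sketch: with S_N = X1 + · · · + XN,
$$E(S_N) = \sum_{n \ge 0} E(S_n)\,P(N = n) = \sum_{n \ge 0} n\mu\,P(N = n) = \mu E(N),$$
where the independence of N from the Xi justifies replacing E(S_n | N = n) by E(S_n) = nμ.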
6. Hugo’s bowl of spaghetti contains n strands. He selects two ends at random and joins them. He does this until no ends are left. What is the expected number of spaghetti hoops in his bowl?
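A sketch of the classical computation: when k strands remain there are 2k free ends, and, fixing the first chosen end, its partner end on the same strand is one of the other 2k − 1 ends, each equally likely; so the join closes a hoop with probability 1/(2k − 1). Since every join reduces the number of strands by one, the expected number of hoops is
$$\sum_{k=1}^{n} \frac{1}{2k-1} = 1 + \frac{1}{3} + \cdots + \frac{1}{2n-1}.$$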
5. Let X and Y be independent discrete random variables, X having the geometric distribution with parameter p and Y having the geometric distribution with parameter r. Show that U = min{X, Y} has …
4. Let X1, X2, . . . , Xn be independent discrete random variables, each having mass function P(Xi = k) = 1/N for k = 1, 2, . . . , N. Find the mass functions of Un and Vn, given by Un = min{X1, X2, . . . , Xn}, Vn = max{X1, X2, . . . , Xn}.
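Both questions succumb to tail probabilities (a sketch). For Problem 4, independence gives $P(V_n \le k) = (k/N)^n$ and $P(U_n > k) = ((N-k)/N)^n$, so, for k = 1, 2, . . . , N,
$$P(V_n = k) = \frac{k^n - (k-1)^n}{N^n}, \qquad P(U_n = k) = \frac{(N-k+1)^n - (N-k)^n}{N^n}.$$
For Problem 5, $P(\min\{X,Y\} > k) = P(X > k)P(Y > k) = [(1-p)(1-r)]^k$, so the minimum is geometric with parameter 1 − (1 − p)(1 − r), presumably the truncated claim.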
3. If X and Y are discrete random variables, each taking only two distinct values, prove that X and Y are independent if and only if E(XY ) = E(X)E(Y ).
2. Independent random variables U and V each take the values −1 or 1 only, and P(U = 1) = a, P(V = 1) = b, where 0 …
Example 3.37 The 2n seats around a circular table are numbered clockwise. The guests at dinner form n king/queen pairs. The queens sit at random in the odd-numbered seats, with the kings at random … (3.39). Now $1_{A_i}^2 = 1_{A_i}$, since an indicator function takes only the values 0 and 1, and also $1_{A_i}1_{A_j} = 1_{A_i \cap A_j}$. Therefore, by symmetry,
$$E(N^2) = E\Bigl(\sum_i 1_{A_i}\Bigr) + 2\sum_{i<j} P(A_i \cap A_j)$$
… where the two terms correspond to whether or not the second queen sits next to the first couple. By (3.39)–(3.41),
$$E(N^2) = 2 + n(n-1)\cdot\frac{2(2n-3)}{n(n-1)^2},$$
and hence var(N) = E(N²) − E(N)² … $\sum_i P(A_i)$. This is, however, harder to prove. See the footnote on p. 40.
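Finishing the displayed computation (a sketch, reading N as the number of couples seated adjacently, which the quoted E(N²) presupposes): each queen's king occupies one of the n even seats at random and two of them adjoin her, so E(N) = n · (2/n) = 2, and
$$\mathrm{var}(N) = 2 + \frac{2(2n-3)}{n-1} - 2^2 = \frac{2(n-2)}{n-1}.$$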
Exercise 3.25 If X and Y are independent discrete random variables, show that the two random variables g(X) and h(Y ) are independent also, for any functions g and h which map R into R.
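A sketch for Exercise 3.25: for any u, v in the ranges of g and h,
$$P(g(X) = u,\ h(Y) = v) = \sum_{x:\,g(x)=u}\ \sum_{y:\,h(y)=v} P(X = x)P(Y = y) = P(g(X) = u)\,P(h(Y) = v),$$
the factorisation of each joint term being exactly the independence of X and Y.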
Exercise 3.23 Let X and Y be independent discrete random variables. Prove that P(X ≥ x and Y ≥ y) = P(X ≥ x)P(Y ≥ y) for all x, y ∈ R.
Exercise 3.24 The indicator function of an event A is … Show that two events A and B are independent if and only if their indicator functions are independent random variables.
Example 3.22 Suppose that X has distribution given by P(X = −1) = P(X = 0) = P(X = 1) = 1/3, and Y is given by Y = 0 if X = 0, and Y = 1 if X ≠ 0. It is easy to find a probability space (Ω, F, P), together with two random variables having these distributions. For example, take Ω = {−1, 0, 1}, F the set of all subsets of Ω, P given by P(−1) = …
Exercise 3.9 The pair of discrete random variables (X, Y) has joint mass function P(X = i, Y = j) = θ^{i+j+1} if i, j = 0, 1, 2, and 0 otherwise, for some value of θ. Show that θ satisfies the …
Exercise 3.8 Two cards are drawn at random from a deck of 52 cards. If X denotes the number of aces drawn and Y denotes the number of kings, display the joint mass function of X and Y in the tabular …
Similar ideas apply to families X = (X1, X2, . . . , Xn) of discrete random variables on a probability space. For example, the joint mass function of X is the function pX defined by pX(x) = P(X1 = x1, X2 = x2, . . . , Xn = xn) for x = (x1, x2, . . . , xn).
Example 3.7 Suppose that X and Y are random variables each taking the values 1, 2, or 3, and that the probability that the pair (X, Y) equals (x, y) is given in Table 3.1 for all relevant values of …
10. A population of N animals has had a certain number a of its members captured, marked, and then released. Show that the probability Pn that it is necessary to capture n animals in order to obtain … where m ≤ n ≤ N − a + m. Hence, show that
$$\frac{a}{N}\binom{a-1}{m-1}\frac{(N-a)!}{(N-1)!}\sum_{n=m}^{N-a+m}\frac{(n-1)!\,(N-n)!}{(n-m)!\,(N-a+m-n)!} = 1,$$
and that the expectation …
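One way to verify the displayed identity (a sketch, under the standard reading that capturing continues until m marked animals have been obtained): rearranging factorials shows that the n-th summand, multiplied by the prefactor, equals
$$P_n = \binom{n-1}{m-1}\binom{N-n}{a-m}\bigg/\binom{N}{a},$$
and Vandermonde's identity gives $\sum_{n=m}^{N-a+m}\binom{n-1}{m-1}\binom{N-n}{a-m} = \binom{N}{a}$, so the probabilities sum to 1.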
9. The probability of obtaining a head when a certain coin is tossed is p. The coin is tossed repeatedly until n heads occur in a row. Let X be the total number of tosses required for this to happen.
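Assuming the truncated question asks for E(X), as is usual for this problem, a recurrence sketch: let e_k be the mean number of tosses needed for k heads in a row. To achieve k in a row, first achieve k − 1 (mean e_{k−1}), toss once more, and start afresh on a tail: e_k = e_{k−1} + 1 + (1 − p)e_k, so e_k = (e_{k−1} + 1)/p and
$$E(X) = e_n = \sum_{k=1}^{n} p^{-k} = \frac{1 - p^{n}}{p^{n}(1 - p)}.$$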
* 8. An ambidextrous student has a left and a right pocket, each initially containing n humbugs. Each time he feels hungry, he puts a hand into one of his pockets and, if it is not empty, he takes a …
7. Coupon-collecting problem. There are c different types of coupon, and each coupon obtained is equally likely to be any one of the c types. Find the probability that the first n coupons which you …
A fair die having two faces coloured blue, two red and two green, is thrown repeatedly. Find the probability that not all colours occur in the first k throws. Deduce that, if N is the random variable … 11/2. (Oxford 1979M)
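A sketch of the first part, by inclusion-exclusion over the three colours (valid for k ≥ 1, each colour having chance 1/3 per throw):
$$P(\text{not all colours in the first } k \text{ throws}) = 3\bigl(\tfrac{2}{3}\bigr)^k - 3\bigl(\tfrac{1}{3}\bigr)^k.$$
Reading N as the number of throws until all three colours have appeared, summing tail probabilities gives $E(N) = \sum_{k \ge 0} P(N > k) = 1 + \sum_{k \ge 1}\bigl[3(2/3)^k - 3(1/3)^k\bigr] = 1 + 6 - \tfrac{3}{2} = \tfrac{11}{2}$, in agreement with the value 11/2 surviving at the end of the fragment.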
5. Lack-of-memory property. If X has the geometric distribution with parameter p, show that P(X > m + n | X > m) = P(X > n) for m, n = 0, 1, 2, . . . . We say that X has the ‘lack-of-memory property’ since, if we are given that X − m > 0, then the distribution of X − m is the same as the original distribution of X. Show that the geometric …
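A sketch, using the tail formula of Exercise 2.24 below: P(X > k) = (1 − p)^k = q^k, so
$$P(X > m + n \mid X > m) = \frac{P(X > m + n)}{P(X > m)} = \frac{q^{m+n}}{q^m} = q^n = P(X > n).$$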
4. For what values of c and α is the function p, defined by p(k) = ck^α for k = 1, 2, . . . , and p(k) = 0 otherwise, a mass function?
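A sketch: we need c > 0 and $\sum_{k \ge 1} k^{\alpha} < \infty$, which happens exactly when α < −1; the normalisation then forces $c = \bigl(\sum_{k \ge 1} k^{\alpha}\bigr)^{-1} = 1/\zeta(-\alpha)$.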
3. If X is a discrete random variable and E(X²) = 0, show that P(X = 0) = 1. Deduce that, if var(X) = 0, then P(X = μ) = 1, whenever μ = E(X) is finite.
2. Each toss of a coin results in heads with probability p (> 0). If m(r) is the mean number of tosses up to and including the r-th head, show that m(r) = p{1 + m(r − 1)} + (1 − p){1 + m(r)} for …
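Rearranging the displayed recurrence gives p·m(r) = 1 + p·m(r − 1), that is, m(r) = m(r − 1) + 1/p; with m(0) = 0 this telescopes to m(r) = r/p, presumably the truncated conclusion.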
1. If X has the Poisson distribution with parameter λ, show that E(X(X − 1)(X − 2) · · · (X − k)) = λ^{k+1} for k = 0, 1, 2, . . . .
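A sketch of the factorial-moment computation (the product contains k + 1 factors, whence the exponent k + 1):
$$E\bigl(X(X-1)\cdots(X-k)\bigr) = \sum_{x \ge k+1} \frac{x!}{(x-k-1)!}\cdot\frac{e^{-\lambda}\lambda^{x}}{x!} = \lambda^{k+1}\sum_{x \ge k+1}\frac{e^{-\lambda}\lambda^{x-k-1}}{(x-k-1)!} = \lambda^{k+1}.$$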
Exercise 2.39 Find E(X) and E(X²) when X has the Poisson distribution with parameter λ, and hence show that the Poisson distribution has variance equal to its mean. What is the …
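A sketch: E(X) = λ, and the k = 1 case of Problem 1 above gives E(X(X − 1)) = λ², so E(X²) = λ² + λ and var(X) = E(X²) − E(X)² = λ.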
Exercise 2.38 Show that var(aX + b) = a² var(X) for a, b ∈ R.
Exercise 2.37 If X has the binomial distribution with parameters n and p = 1 − q, show that E(X) = np, E(X²) = npq + n²p², and deduce the variance of X.
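The deduction is immediate: var(X) = E(X²) − E(X)² = npq + n²p² − (np)² = npq.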
2.3 Functions of discrete random variables
Let X be a discrete random variable on the probability space (Ω, F, P) and let g : R → R. It is easy to check that Y = g(X) is a discrete random variable … Simple examples are: if g(x) = ax + b then g(X) = aX + b; if g(x) = cx² then g(X) = cX². If Y = g(X), the mass function of Y is given by
$$p_Y(y) = P(Y = y) = P(g(X) = y) = \sum_{x:\,g(x)=y} P(X = x),$$
since there are only countably many non-zero contributions to this sum. Thus, if Y = aX + b with a ≠ 0, then P(Y = y) = P(aX + b = y) = P(X = a⁻¹(y − b)) for y ∈ R, while if Y = X², then P(Y …
Exercise 2.26 Let X be a discrete random variable having the Poisson distribution with parameter λ, and let Y = |sin(πX/2)|. Find the mass function of Y.
2.4 Expectation
Consider a fair die. If … which we call the mean value. This notion of mean value is easily extended to more general distributions as follows.
Definition 2.27 If X is a discrete random variable, the expectation of X is denoted E(X) and defined by
$$E(X) = \sum_{x \in \mathrm{Im}\,X} x\,P(X = x) \qquad (2.28)$$
whenever this sum converges absolutely, in that $\sum_x |x\,P(X = x)| < \infty$. Equation (2.28) is often written
$$E(X) = \sum_x x\,P(X = x) = \sum_x x\,p_X(x),$$
and the expectation of X is often called the expected value or mean of X.³ The reason for requiring absolute convergence in (2.28) is that the image Im X may be an infinite set, and we … (³ One should be …)
If X is a discrete random variable (on some probability space) and g : R → R, then Y = g(X) is a discrete random variable also. According to the above definition, we need to know the mass function … Theorem 2.29 states that
$$E(g(X)) = \sum_{x \in \mathrm{Im}\,X} g(x)\,P(X = x)$$
whenever this sum converges absolutely. Intuitively, this result is rather clear, since g(X) takes the value g(x) when X takes the value x, an event which has probability P(X = x). A more formal proof … if the last sum converges absolutely. □
Two simple but useful properties of expectation are as follows.
Theorem 2.30 Let X be a discrete random variable and let a, b ∈ R. (a) If P(X ≥ 0) = 1 and …
Here is an example of Theorem 2.29 in action.
Example 2.31 Suppose that X is a random variable with the Poisson distribution, parameter λ, and we wish to find the expected value of Y = e^X. Without Theorem 2.29 we would have to find the mass …
The expectation E(X) of a discrete random variable X is an indication of the ‘centre’ of the distribution of X. Another important quantity associated with X is the ‘variance’ of X, and this … We note that, by Theorem 2.29,
$$\mathrm{var}(X) = \sum_{x \in \mathrm{Im}\,X} (x - \mu)^2\,P(X = x), \qquad (2.34)$$
where μ = E(X). A rough motivation for this definition is as follows. If the dispersion of X about its expectation is very …
Example 2.36 If X has the geometric distribution with parameter p (= 1 − q), the mean of X is
$$E(X) = \sum_{k=1}^{\infty} kpq^{k-1} = \frac{p}{(1-q)^2} = \frac{1}{p},$$
and …
Exercise 2.24 If X is a discrete random variable having the geometric distribution with parameter p, show that the probability that X is greater than k is (1 − p)^k.
Exercise 2.23 If X is a discrete random variable having the Poisson distribution with parameter λ, show that the probability that X is even is e^{−λ} cosh λ.
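Quick sketches for the two exercises above: for Exercise 2.24, $P(X > k) = \sum_{j > k} p(1-p)^{j-1} = (1-p)^k$; for Exercise 2.23, $P(X \text{ even}) = \sum_{k \text{ even}} e^{-\lambda}\lambda^k/k! = e^{-\lambda}\cdot\tfrac{1}{2}(e^{\lambda} + e^{-\lambda}) = e^{-\lambda}\cosh\lambda$.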
Let Y be the total number of tosses in this experiment, so that Y(T^k H) = k + 1 for 0 ≤ k < ∞ and Y(T^∞) = ∞. If p > 0, then P(Y = k) = P(T^{k−1}H) = pq^{k−1} for k = 1, 2, . . . , showing that Y …
2.22 If we carry on tossing the coin in the previous example until the nth head has turned up, then a similar argument shows that, if p ∈ (0, 1), the total number of tosses required has the negative binomial distribution with parameters n and p.
Example 2.18 Here is an example of some of the above distributions in action. Suppose that a coin is tossed n times and there is probability p that heads appears on each toss. Representing heads by H …
$$P(\omega) = p^{h(\omega)} q^{t(\omega)},$$
where h(ω) is the number of heads in ω and t(ω) = n − h(ω) is the number of tails. Similarly, for any A ∈ F, $P(A) = \sum_{\omega \in A} P(\omega)$. For i = 1, 2, . . . , n, we define the … where ωᵢ is the i-th entry in ω. Thus
$$P(X_i = 0) = \sum_{\omega:\,\omega_i = T} p^{h(\omega)} q^{n-h(\omega)} = \sum_{h=0}^{n-1}\ \sum_{\omega:\,\omega_i = T,\ h(\omega) = h} p^h q^{n-h} = \sum_{h=0}^{n-1} \binom{n-1}{h} p^h q^{n-h} = q(p+q)^{n-1} = q.$$
Hence, each Xi has the Bernoulli distribution with parameter p. We have derived this fact in a cumbersome manner, but we believe these details to be instructive. Let Sn = X1 + X2 + · · · + Xn, … which is to say that Sn(ω) = X1(ω) + X2(ω) + · · · + Xn(ω). Clearly, Sn is the total number of heads which occur, and Sn takes values in {0, 1, . . . , n} since each Xi equals 0 or 1. Also,
$$P(S_n = k) = \binom{n}{k} p^k q^{n-k}, \qquad (2.19)$$
and so Sn has the binomial distribution with parameters n and p.
If n is very large and p is very small but np is a ‘reasonable size’ (np = λ, say), then the distribution of Sn may be approximated by the Poisson distribution with parameter λ, as follows. For …
This approximation may be useful in practice. For example, consider a single page of the Guardian newspaper containing, say, 10^6 characters, and suppose that the typesetter flips a coin before … 10^{−5}. It may be easier (and not too inaccurate) to use (2.20) rather than (2.19) to calculate probabilities. In this case, λ = np = 10 and so, for …
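A quick numerical check of the approximation described above, at the passage's own figures n = 10^6, p = 10^{−5} (an illustrative sketch, not from the book; the helper function names are ad hoc). Working with logarithms keeps the enormous binomial coefficients from overflowing:

    import math

    def binom_pmf(n, k, p):
        # log-space evaluation of C(n, k) * p^k * (1-p)^(n-k), safe for large n
        log_pmf = (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
                   + k * math.log(p) + (n - k) * math.log1p(-p))
        return math.exp(log_pmf)

    def poisson_pmf(lam, k):
        # Poisson mass function exp(-lam) * lam^k / k!
        return math.exp(k * math.log(lam) - lam - math.lgamma(k + 1))

    n, p = 10**6, 1e-5      # the newspaper example: lambda = n*p = 10
    lam = n * p
    for k in (0, 5, 10, 15, 20):
        print(k, binom_pmf(n, k, p), poisson_pmf(lam, k))

For these values the two columns agree closely, which is the point of preferring (2.20) to (2.19) in calculations.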
2.2 Examples
Certain types of discrete random variables occur frequently, and we list some of these. Throughout this section, n is a positive integer, p is a number in [0, 1], and q = 1 − p. We never …
… pX(0) = q, pX(1) = p, pX(x) = 0 if x ≠ 0, 1.
Coin tosses are the building blocks of probability theory. There is a sense in which the entire theory can be constructed from an infinite sequence of coin tosses.
Binomial distribution. We say that X has the binomial distribution with parameters n and p if X takes values in {0, 1, . . . , n} and
$$P(X = k) = \binom{n}{k} p^k q^{n-k} \quad \text{for } k = 0, 1, 2, \ldots, n. \qquad (2.14)$$
Note that (2.14) gives rise to a mass function satisfying (2.6) since, by the binomial theorem,
$$\sum_{k=0}^{n} \binom{n}{k} p^k q^{n-k} = (p + q)^n = 1.$$
Poisson distribution. We say that X has the Poisson distribution with parameter λ (> 0) if X takes values in {0, 1, 2, . . . } and
$$P(X = k) = \frac{1}{k!}\lambda^k e^{-\lambda} \quad \text{for } k = 0, 1, 2, \ldots. \qquad (2.15)$$
Again, this gives rise to a mass function since
$$\sum_{k=0}^{\infty} \frac{1}{k!}\lambda^k e^{-\lambda} = e^{-\lambda}\sum_{k=0}^{\infty}\frac{\lambda^k}{k!} = e^{-\lambda}e^{\lambda} = 1.$$
Geometric distribution. We say that X has the geometric distribution with parameter p ∈ (0, 1) if X takes values in {1, 2, 3, . . . } and
$$P(X = k) = pq^{k-1} \quad \text{for } k = 1, 2, 3, \ldots. \qquad (2.16)$$
As before, …
Negative binomial distribution. We say that X has the negative binomial distribution with parameters n and p ∈ (0, 1) if X takes values in {n, n + 1, n + 2, . . . } and
$$P(X = k) = \binom{k-1}{n-1} p^n q^{k-n} \quad \text{for } k = n, n + 1, \ldots.$$
As before, note that
$$\sum_{k=n}^{\infty} \binom{k-1}{n-1} p^n q^{k-n} = p^n \sum_{l=0}^{\infty} \binom{n+l-1}{l} q^l = p^n \sum_{l=0}^{\infty} \binom{-n}{l} (-q)^l = p^n (1-q)^{-n} = 1,$$
where l = k − n, using the binomial expansion of (1 − q)^{−n}.
Exercise 2.12 For what value of c is the function p, defined by p(k) = c/(k(k + 1)) if k = 1, 2, . . . , and p(k) = 0 otherwise, a mass function?
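A sketch: the series telescopes, $\sum_{k \ge 1} \frac{1}{k(k+1)} = \sum_{k \ge 1}\Bigl(\frac{1}{k} - \frac{1}{k+1}\Bigr) = 1$, so c = 1.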
Exercise 2.11 Let (Ω, F, P) be a probability space in which Ω = {1, 2, 3, 4, 5, 6}, F = {∅, {2, 4, 6}, {1, 3, 5}, Ω}, and let U, V, W be functions on Ω defined by U(ω) = ω, V(ω) = 1 if ω is even and V(ω) = 0 if ω is odd, and W(ω) = ω², for ω ∈ Ω. Determine which of U, V, W are discrete random variables on the probability space.
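A sketch, using the requirement that {ω : X(ω) = x} ∈ F for every x: V is a discrete random variable, since {ω : V(ω) = 1} = {2, 4, 6} and {ω : V(ω) = 0} = {1, 3, 5} both belong to F; U and W are not, since, for example, {ω : U(ω) = 1} = {1} = {ω : W(ω) = 1} does not belong to F.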
Exercise 2.10 If E is an event of the probability space (Ω, F, P), show that the indicator function of E, defined to be the function 1E on Ω given by 1E(ω) = 1 if ω ∈ E, and 1E(ω) = 0 if ω ∉ E, is a discrete random variable.
Exercise 2.9 Show that if F is the power set of Ω, then all functions which map Ω into a countable subset of R are discrete random variables.
Exercise 2.8 If X and Y are discrete random variables on the probability space (Ω, F, P), show that U and V are discrete random variables on this space also, where U(ω) = X(ω) + Y(ω), V(ω) = …
19. There are n socks in a drawer, three of which are red and the rest black. John chooses his socks by selecting two at random from the drawer, and puts them on. He is three times more likely to …
* 18. Show that the axiom that P is countably additive is equivalent to the axiom that P is finitely additive and continuous. That is to say, let Ω be a set and F an event space of subsets of Ω. If P … $P\bigl(\bigcup_i A_i\bigr) = \sum_i P(A_i)$ for all sequences A1, A2, . . . of disjoint events.
17. A coin is tossed repeatedly; on each toss a head is shown with probability p, or a tail with probability 1 − p. The outcomes of the tosses are independent. Let E denote the event that the first …